Oct 07 17:03:15 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 07 17:03:15 crc restorecon[4556]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: 
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 
17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 17:03:15 crc 
restorecon[4556]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 
17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:15 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 17:03:16 crc restorecon[4556]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 17:03:16 crc restorecon[4556]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 17:03:16 crc restorecon[4556]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 07 17:03:16 crc kubenswrapper[4681]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 07 17:03:16 crc kubenswrapper[4681]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 07 17:03:16 crc kubenswrapper[4681]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 07 17:03:16 crc kubenswrapper[4681]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
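
The deprecated-flag warnings above all point at the same migration: each flag should move into the kubelet config file named by --config (see the linked kubernetes.io page); on an OpenShift/CRC node that file is typically managed by the machine-config stack rather than edited by hand. As a minimal sketch (not part of the log), assuming the journal text above has been saved with something like `journalctl -u kubelet > kubelet.log`, the following Python inventories the deprecated flags and the unrecognized feature gates that this boot reports. The regexes are assumptions based on the message shapes visible here, not a documented interface, and the script name is hypothetical.

    #!/usr/bin/env python3
    """summarize_kubelet_warnings.py (hypothetical helper, not from the log):
    tally kubelet deprecation and feature-gate warnings from saved journal text."""
    import re
    import sys
    from collections import Counter

    # Patterns mirror the message text seen in this journal; they may need
    # adjustment for other kubelet versions.
    DEPRECATED = re.compile(r"Flag (--[\w-]+) has been deprecated")
    UNRECOGNIZED = re.compile(r"unrecognized feature gate: ([\w-]+)")

    def summarize(lines):
        flags, gates = set(), Counter()
        for line in lines:
            if (m := DEPRECATED.search(line)):
                flags.add(m.group(1))       # e.g. --system-reserved
            if (m := UNRECOGNIZED.search(line)):
                gates[m.group(1)] += 1      # e.g. GatewayAPI
        return flags, gates

    if __name__ == "__main__":
        flags, gates = summarize(sys.stdin)
        print("deprecated flags:", ", ".join(sorted(flags)) or "none")
        print(f"unrecognized feature gates: {len(gates)} distinct")

Usage would be `journalctl -u kubelet | python3 summarize_kubelet_warnings.py`. The "unrecognized" gates are not necessarily errors: gates such as GatewayAPI or NewOLM are OpenShift-level feature gates passed down to the kubelet, which only recognizes upstream Kubernetes gates and logs the rest as warnings.
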
Oct 07 17:03:16 crc kubenswrapper[4681]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 07 17:03:16 crc kubenswrapper[4681]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.781344 4681 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791227 4681 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791270 4681 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791279 4681 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791286 4681 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791297 4681 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791306 4681 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791314 4681 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791322 4681 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791330 4681 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791337 4681 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791344 4681 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791350 4681 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791357 4681 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791364 4681 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791371 4681 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791377 4681 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791384 4681 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791390 4681 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791396 4681 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791405 4681 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791413 4681 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791422 4681 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791428 4681 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791435 4681 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791442 4681 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791451 4681 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791458 4681 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791464 4681 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791470 4681 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791478 4681 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791485 4681 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791491 4681 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791497 4681 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791503 4681 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791509 4681 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791516 4681 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791522 4681 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791528 4681 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791534 4681 feature_gate.go:330] unrecognized feature gate: Example Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791540 4681 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791547 4681 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791555 4681 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791562 4681 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791569 4681 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791575 4681 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791582 4681 
feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791588 4681 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791595 4681 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791601 4681 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791607 4681 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791613 4681 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791621 4681 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791628 4681 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791634 4681 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791640 4681 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791647 4681 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791654 4681 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791663 4681 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791674 4681 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791682 4681 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791691 4681 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791698 4681 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791705 4681 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791713 4681 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791719 4681 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791726 4681 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791733 4681 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791740 4681 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791746 4681 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791753 4681 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.791759 4681 feature_gate.go:330] unrecognized feature gate: 
PlatformOperators Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.792801 4681 flags.go:64] FLAG: --address="0.0.0.0" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.792825 4681 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.792842 4681 flags.go:64] FLAG: --anonymous-auth="true" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.792852 4681 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.792863 4681 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.792873 4681 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.792911 4681 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.792921 4681 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.792929 4681 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.792937 4681 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.792948 4681 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.792957 4681 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.792965 4681 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.792972 4681 flags.go:64] FLAG: --cgroup-root="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.792980 4681 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.792987 4681 flags.go:64] FLAG: --client-ca-file="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.792996 4681 flags.go:64] FLAG: --cloud-config="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793004 4681 flags.go:64] FLAG: --cloud-provider="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793012 4681 flags.go:64] FLAG: --cluster-dns="[]" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793021 4681 flags.go:64] FLAG: --cluster-domain="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793028 4681 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793036 4681 flags.go:64] FLAG: --config-dir="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793044 4681 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793052 4681 flags.go:64] FLAG: --container-log-max-files="5" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793062 4681 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793070 4681 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793078 4681 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793087 4681 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793095 4681 flags.go:64] FLAG: --contention-profiling="false" Oct 07 17:03:16 crc 
kubenswrapper[4681]: I1007 17:03:16.793103 4681 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793111 4681 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793119 4681 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793127 4681 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793137 4681 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793145 4681 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793152 4681 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793160 4681 flags.go:64] FLAG: --enable-load-reader="false" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793167 4681 flags.go:64] FLAG: --enable-server="true" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793175 4681 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793186 4681 flags.go:64] FLAG: --event-burst="100" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793193 4681 flags.go:64] FLAG: --event-qps="50" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793201 4681 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793210 4681 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793217 4681 flags.go:64] FLAG: --eviction-hard="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793227 4681 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793234 4681 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793242 4681 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793249 4681 flags.go:64] FLAG: --eviction-soft="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793255 4681 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793261 4681 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793267 4681 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793273 4681 flags.go:64] FLAG: --experimental-mounter-path="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793280 4681 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793286 4681 flags.go:64] FLAG: --fail-swap-on="true" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793292 4681 flags.go:64] FLAG: --feature-gates="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793299 4681 flags.go:64] FLAG: --file-check-frequency="20s" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793305 4681 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793312 4681 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793320 4681 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 07 17:03:16 crc 
kubenswrapper[4681]: I1007 17:03:16.793328 4681 flags.go:64] FLAG: --healthz-port="10248" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793335 4681 flags.go:64] FLAG: --help="false" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793343 4681 flags.go:64] FLAG: --hostname-override="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793350 4681 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793358 4681 flags.go:64] FLAG: --http-check-frequency="20s" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793366 4681 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793373 4681 flags.go:64] FLAG: --image-credential-provider-config="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793380 4681 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793388 4681 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793395 4681 flags.go:64] FLAG: --image-service-endpoint="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793403 4681 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793410 4681 flags.go:64] FLAG: --kube-api-burst="100" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793418 4681 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793426 4681 flags.go:64] FLAG: --kube-api-qps="50" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793434 4681 flags.go:64] FLAG: --kube-reserved="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793443 4681 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793450 4681 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793458 4681 flags.go:64] FLAG: --kubelet-cgroups="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793465 4681 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793472 4681 flags.go:64] FLAG: --lock-file="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793480 4681 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793487 4681 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793495 4681 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793508 4681 flags.go:64] FLAG: --log-json-split-stream="false" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793515 4681 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793523 4681 flags.go:64] FLAG: --log-text-split-stream="false" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793531 4681 flags.go:64] FLAG: --logging-format="text" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793538 4681 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793546 4681 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793555 4681 flags.go:64] FLAG: --manifest-url="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 
17:03:16.793563 4681 flags.go:64] FLAG: --manifest-url-header="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793574 4681 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793581 4681 flags.go:64] FLAG: --max-open-files="1000000" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793591 4681 flags.go:64] FLAG: --max-pods="110" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793598 4681 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793606 4681 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793614 4681 flags.go:64] FLAG: --memory-manager-policy="None" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793622 4681 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793629 4681 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793637 4681 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793644 4681 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793662 4681 flags.go:64] FLAG: --node-status-max-images="50" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793670 4681 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793678 4681 flags.go:64] FLAG: --oom-score-adj="-999" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793686 4681 flags.go:64] FLAG: --pod-cidr="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793693 4681 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793707 4681 flags.go:64] FLAG: --pod-manifest-path="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793716 4681 flags.go:64] FLAG: --pod-max-pids="-1" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793725 4681 flags.go:64] FLAG: --pods-per-core="0" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793733 4681 flags.go:64] FLAG: --port="10250" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793741 4681 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793748 4681 flags.go:64] FLAG: --provider-id="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793756 4681 flags.go:64] FLAG: --qos-reserved="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793765 4681 flags.go:64] FLAG: --read-only-port="10255" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793773 4681 flags.go:64] FLAG: --register-node="true" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793781 4681 flags.go:64] FLAG: --register-schedulable="true" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793789 4681 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793802 4681 flags.go:64] FLAG: --registry-burst="10" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793810 4681 flags.go:64] FLAG: --registry-qps="5" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793819 
4681 flags.go:64] FLAG: --reserved-cpus="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793826 4681 flags.go:64] FLAG: --reserved-memory="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793836 4681 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793843 4681 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793851 4681 flags.go:64] FLAG: --rotate-certificates="false" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793859 4681 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793868 4681 flags.go:64] FLAG: --runonce="false" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793925 4681 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793935 4681 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793943 4681 flags.go:64] FLAG: --seccomp-default="false" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793951 4681 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793958 4681 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793966 4681 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793975 4681 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793982 4681 flags.go:64] FLAG: --storage-driver-password="root" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793990 4681 flags.go:64] FLAG: --storage-driver-secure="false" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.793997 4681 flags.go:64] FLAG: --storage-driver-table="stats" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.794005 4681 flags.go:64] FLAG: --storage-driver-user="root" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.794012 4681 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.794020 4681 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.794028 4681 flags.go:64] FLAG: --system-cgroups="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.794036 4681 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.794048 4681 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.794056 4681 flags.go:64] FLAG: --tls-cert-file="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.794064 4681 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.794073 4681 flags.go:64] FLAG: --tls-min-version="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.794081 4681 flags.go:64] FLAG: --tls-private-key-file="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.794088 4681 flags.go:64] FLAG: --topology-manager-policy="none" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.794095 4681 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.794103 4681 flags.go:64] FLAG: --topology-manager-scope="container" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.794111 
4681 flags.go:64] FLAG: --v="2" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.794121 4681 flags.go:64] FLAG: --version="false" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.794131 4681 flags.go:64] FLAG: --vmodule="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.794140 4681 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.794148 4681 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798289 4681 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798319 4681 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798325 4681 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798331 4681 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798337 4681 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798342 4681 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798348 4681 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798353 4681 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798358 4681 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798364 4681 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798371 4681 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798379 4681 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
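[Editor's note] The long run of flags.go:64] FLAG: --name="value" lines above is the kubelet dumping every registered flag with its effective value at startup, defaults included, which is why entries such as --containerd appear even though CRI-O is the runtime here. A stdlib approximation of that dump (the real code lives in the Kubernetes component-base flag helpers):

    package main

    import (
        "flag"
        "fmt"
    )

    func main() {
        flag.Int("v", 2, "log verbosity")
        flag.String("node-ip", "192.168.126.11", "node IP to report")
        flag.Parse()
        // One FLAG: --name="value" line per registered flag, mirroring flags.go:64.
        flag.VisitAll(func(f *flag.Flag) {
            fmt.Printf("FLAG: --%s=%q\n", f.Name, f.Value.String())
        })
    }

Values like --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" in the dump are comma-separated key=value resource lists that get parsed into a map before node-allocatable enforcement.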
Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798386 4681 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798393 4681 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798399 4681 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798406 4681 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798412 4681 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798418 4681 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798423 4681 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798430 4681 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798435 4681 feature_gate.go:330] unrecognized feature gate: Example Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798441 4681 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798446 4681 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798453 4681 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798461 4681 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798474 4681 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798481 4681 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798487 4681 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798493 4681 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798498 4681 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798504 4681 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798509 4681 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798515 4681 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798520 4681 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798525 4681 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798530 4681 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798536 4681 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798541 4681 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798546 4681 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798552 4681 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798557 4681 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798562 4681 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798568 4681 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798573 4681 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798578 4681 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798583 4681 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798588 4681 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798593 4681 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798599 4681 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798604 4681 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798609 4681 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 07 
17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798614 4681 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798621 4681 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798626 4681 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798631 4681 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798636 4681 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798644 4681 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798649 4681 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798654 4681 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798660 4681 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798667 4681 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798673 4681 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798681 4681 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798687 4681 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798696 4681 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798703 4681 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798710 4681 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798715 4681 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798720 4681 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798726 4681 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.798731 4681 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.798741 4681 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.808714 4681 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.808757 4681 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.808904 4681 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.808919 4681 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.808927 4681 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.808935 4681 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.808944 4681 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.808954 4681 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
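[Editor's note] Each wave of feature_gate.go warnings above and below ends in the same resolved map, the waves apparently repeating because the gate set is re-applied at several points during startup (flag parsing, then again as configuration is loaded). The "unrecognized feature gate" entries are OpenShift-level gates (GatewayAPI, NewOLM, PinnedImages, ...) handed to a kubelet whose gate table only knows upstream Kubernetes gates, so they warn and are skipped rather than failing startup. A minimal sketch of that behaviour, with an invented two-entry gate table:

    package main

    import "fmt"

    func main() {
        // Gates this binary knows about, with defaults (illustrative subset).
        known := map[string]bool{
            "CloudDualStackNodeIPs": false,
            "KMSv1":                 false,
        }
        // Gates requested by the platform, including one the binary has never heard of.
        requested := []struct {
            name string
            val  bool
        }{
            {"CloudDualStackNodeIPs", true},
            {"KMSv1", true},
            {"GatewayAPI", true},
        }
        for _, g := range requested {
            if _, ok := known[g.name]; !ok {
                // Mirrors feature_gate.go:330: warn and keep going.
                fmt.Printf("W unrecognized feature gate: %s\n", g.name)
                continue
            }
            known[g.name] = g.val
        }
        fmt.Printf("I feature gates: %v\n", known)
    }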
Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.808963 4681 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.808971 4681 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.808979 4681 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.808987 4681 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.808994 4681 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809001 4681 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809009 4681 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809016 4681 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809023 4681 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809030 4681 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809040 4681 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809050 4681 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809058 4681 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809067 4681 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809075 4681 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809082 4681 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809089 4681 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809096 4681 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809104 4681 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809113 4681 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809123 4681 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809130 4681 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809137 4681 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809145 4681 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809153 4681 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809160 4681 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809168 4681 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809175 4681 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809184 4681 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809192 4681 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809198 4681 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809206 4681 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809213 4681 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809219 4681 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809227 4681 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809233 4681 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809240 4681 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809247 4681 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809254 4681 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809262 4681 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809269 4681 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809277 4681 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809285 4681 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809291 4681 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809298 4681 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809305 4681 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 07 
17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809312 4681 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809318 4681 feature_gate.go:330] unrecognized feature gate: Example Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809325 4681 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809333 4681 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809339 4681 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809347 4681 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809353 4681 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809360 4681 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809367 4681 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809374 4681 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809380 4681 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809387 4681 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809396 4681 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809418 4681 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809436 4681 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809444 4681 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809450 4681 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809468 4681 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809477 4681 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.809488 4681 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809680 4681 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809694 4681 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809702 4681 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809709 4681 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809717 4681 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809724 4681 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809731 4681 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809737 4681 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809745 4681 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809752 4681 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809761 4681 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809772 4681 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809779 4681 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809788 4681 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809796 4681 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809805 4681 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809814 4681 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809822 4681 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809830 4681 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809838 4681 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809845 4681 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809852 4681 feature_gate.go:330] unrecognized feature gate: Example Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809858 4681 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809865 4681 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809872 4681 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809934 4681 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809942 4681 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809949 4681 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809956 4681 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809962 4681 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809970 4681 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809977 4681 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809984 4681 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809990 4681 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.809999 4681 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810006 4681 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810013 4681 
feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810019 4681 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810026 4681 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810032 4681 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810040 4681 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810047 4681 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810055 4681 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810062 4681 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810069 4681 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810075 4681 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810082 4681 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810089 4681 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810096 4681 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810102 4681 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810109 4681 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810116 4681 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810123 4681 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810130 4681 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810136 4681 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810143 4681 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810151 4681 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810159 4681 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810168 4681 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810175 4681 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810181 4681 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810188 4681 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810195 4681 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810202 4681 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810209 4681 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810216 4681 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810225 4681 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810234 4681 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810241 4681 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810248 4681 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.810257 4681 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.810268 4681 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.812108 4681 server.go:940] "Client rotation is on, will bootstrap in background" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.817380 4681 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.817511 4681 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
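[Editor's note] "Client rotation is on" and the certificate lines that follow describe the kubelet's client-certificate manager: it loads the current pair from /var/lib/kubelet/pki/kubelet-client-current.pem, computes a rotation deadline at a jittered point inside the certificate's validity window (upstream client-go picks a random fraction in roughly the 70-90% span of the lifetime), and sleeps until then, which is where the "Waiting 2251h..." figure below comes from. A sketch of that arithmetic, assuming a one-year certificate issued 2025-02-24 to match the expiration logged below:

    package main

    import (
        "fmt"
        "math/rand"
        "time"
    )

    func main() {
        // notAfter is taken from the log; notBefore is an assumption (one-year cert).
        notBefore := time.Date(2025, 2, 24, 5, 52, 8, 0, time.UTC)
        notAfter := time.Date(2026, 2, 24, 5, 52, 8, 0, time.UTC)
        lifetime := notAfter.Sub(notBefore)

        // Jittered deadline somewhere in the 70-90% span of the validity window.
        frac := 0.7 + 0.2*rand.Float64()
        deadline := notBefore.Add(time.Duration(float64(lifetime) * frac))

        fmt.Println("rotation deadline:", deadline)
        fmt.Println("waiting:", time.Until(deadline))
    }

The deadline logged here, 2026-01-09, sits about 87% of the way through that assumed window, consistent with the jitter range.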
Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.819233 4681 server.go:997] "Starting client certificate rotation" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.819269 4681 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.819542 4681 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-09 12:13:49.000284866 +0000 UTC Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.819683 4681 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 2251h10m32.180607267s for next certificate rotation Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.843224 4681 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.847951 4681 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.862588 4681 log.go:25] "Validated CRI v1 runtime API" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.899830 4681 log.go:25] "Validated CRI v1 image API" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.901699 4681 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.909934 4681 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-07-16-57-27-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.910114 4681 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.935590 4681 manager.go:217] Machine: {Timestamp:2025-10-07 17:03:16.931404939 +0000 UTC m=+0.578816574 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2799998 MemoryCapacity:25199476736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:7362d865-50da-43c9-b446-61154f28e86f BootID:6b2328d2-1a46-4216-a128-b08f63d29d00 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599738368 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:3076108 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599738368 Type:vfs Inodes:3076108 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 
Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:2d:cb:78 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:2d:cb:78 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:30:9b:b9 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:66:02:21 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:67:32:7c Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:b0:71:9d Speed:-1 Mtu:1496} {Name:eth10 MacAddress:f6:40:91:1d:eb:99 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:02:89:81:4e:e1:07 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199476736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.936005 4681 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
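The certificate_manager.go:356 entries a few lines back report the client certificate's expiry (2026-02-24 05:52:08 UTC), a rotation deadline of 2026-01-09 12:13:49 UTC, and a wait of 2251h10m32.18s. The deadline deliberately precedes expiry: the manager schedules rotation at a jittered point late in the certificate's validity window so that a fleet of kubelets does not all renew at once. The logged wait checks out against the entry's own timestamp; a worked verification, with the values copied from the log above:

    from datetime import datetime, timezone

    # Timestamp of the "Waiting ... for next certificate rotation" entry above.
    logged_at = datetime(2025, 10, 7, 17, 3, 16, 819683, tzinfo=timezone.utc)
    # Rotation deadline reported by certificate_manager.go:356 (truncated to microseconds).
    deadline = datetime(2026, 1, 9, 12, 13, 49, 284, tzinfo=timezone.utc)

    wait = deadline - logged_at
    hours, rem = divmod(int(wait.total_seconds()), 3600)
    minutes, seconds = divmod(rem, 60)
    print(f"{hours}h{minutes}m{seconds}s")  # -> 2251h10m32s, matching the logged wait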
Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.936282 4681 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.936808 4681 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.937147 4681 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.937205 4681 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.937516 4681 topology_manager.go:138] "Creating topology manager with none policy" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.937534 4681 container_manager_linux.go:303] "Creating device plugin manager" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.938307 4681 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.938371 4681 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.939076 4681 state_mem.go:36] "Initialized new in-memory state store" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.939272 4681 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.942927 4681 kubelet.go:418] "Attempting to sync node with API server" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.942962 4681 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" 
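The container_manager_linux.go:272 entry above embeds the resolved node configuration as a JSON object (nodeConfig={...}), including SystemReserved (200m CPU, 350Mi each of memory and ephemeral storage) and the five hard eviction thresholds. Since the payload is plain JSON, it can be cut out of the journal line and inspected directly; a sketch, again assuming the journal was saved to journal.log and that the object ends at the last brace on the line, as it does above:

    import json
    import re

    def node_config(path="journal.log"):
        """Extract and decode the nodeConfig JSON from the container manager entry."""
        with open(path, encoding="utf-8", errors="replace") as fh:
            for line in fh:
                match = re.search(r"nodeConfig=(\{.*\})", line)
                if match:
                    return json.loads(match.group(1))
        raise SystemExit("no nodeConfig entry found")

    cfg = node_config()
    print(cfg["SystemReserved"])  # {'cpu': '200m', 'ephemeral-storage': '350Mi', 'memory': '350Mi'}
    for t in cfg["HardEvictionThresholds"]:
        value = t["Value"]
        limit = value["Quantity"] or f'{value["Percentage"]:.0%}'
        print(t["Signal"], t["Operator"], limit)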
Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.943025 4681 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.943064 4681 kubelet.go:324] "Adding apiserver pod source" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.943084 4681 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.948415 4681 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.950136 4681 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.952983 4681 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.93:6443: connect: connection refused Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.953057 4681 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.953090 4681 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.93:6443: connect: connection refused Oct 07 17:03:16 crc kubenswrapper[4681]: E1007 17:03:16.953254 4681 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.93:6443: connect: connection refused" logger="UnhandledError" Oct 07 17:03:16 crc kubenswrapper[4681]: E1007 17:03:16.953318 4681 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.93:6443: connect: connection refused" logger="UnhandledError" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.955206 4681 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.955263 4681 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.955284 4681 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.955302 4681 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.955330 4681 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.955349 4681 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.955368 4681 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.955397 4681 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/downward-api" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.955421 4681 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.955438 4681 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.955490 4681 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.955514 4681 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.956671 4681 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.957589 4681 server.go:1280] "Started kubelet" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.958698 4681 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.958847 4681 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.959247 4681 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.93:6443: connect: connection refused Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.959300 4681 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 07 17:03:16 crc systemd[1]: Started Kubernetes Kubelet. Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.960778 4681 server.go:460] "Adding debug handlers to kubelet server" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.961704 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.961749 4681 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.962462 4681 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.962485 4681 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.962674 4681 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.962471 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 12:18:22.604169228 +0000 UTC Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.963002 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 2299h15m5.641172478s for next certificate rotation Oct 07 17:03:16 crc kubenswrapper[4681]: E1007 17:03:16.963586 4681 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 07 17:03:16 crc kubenswrapper[4681]: E1007 17:03:16.967828 4681 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.93:6443: connect: connection refused" interval="200ms" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.968346 4681 factory.go:55] Registering 
systemd factory Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.968373 4681 factory.go:221] Registration of the systemd container factory successfully Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.972271 4681 factory.go:153] Registering CRI-O factory Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.972310 4681 factory.go:221] Registration of the crio container factory successfully Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.972408 4681 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.972477 4681 factory.go:103] Registering Raw factory Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.972529 4681 manager.go:1196] Started watching for new ooms in manager Oct 07 17:03:16 crc kubenswrapper[4681]: W1007 17:03:16.972977 4681 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.93:6443: connect: connection refused Oct 07 17:03:16 crc kubenswrapper[4681]: E1007 17:03:16.973085 4681 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.93:6443: connect: connection refused" logger="UnhandledError" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977555 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977617 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977632 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977643 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977655 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977667 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" 
volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977680 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977692 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977707 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977718 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977732 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977745 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977756 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977770 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977798 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977811 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977823 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977835 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977849 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977861 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977890 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977905 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977918 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977929 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977941 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977951 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977968 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977983 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.977997 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978030 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978059 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978071 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978084 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978097 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978110 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978123 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978137 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978150 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978169 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978182 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978195 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978209 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978222 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978235 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978248 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978261 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978273 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978285 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978296 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978307 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978319 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978345 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978361 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978375 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978388 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978404 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978417 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978428 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978456 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978482 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978495 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978507 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978520 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978531 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978543 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978557 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978571 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978594 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978606 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978618 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978630 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978643 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978655 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978668 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978680 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978703 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978715 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978728 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978739 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978751 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978762 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978773 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978785 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978811 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978823 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978934 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978983 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978994 4681 manager.go:319] Starting recovery of all containers Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.978996 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980329 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980347 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980358 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980369 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980378 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980389 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980399 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980409 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980418 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980426 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980436 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980445 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980454 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980463 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980473 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980483 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980501 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980514 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980528 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980541 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980554 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980566 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980579 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980591 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980604 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980617 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980629 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980640 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980651 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980664 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980676 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980687 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980698 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980711 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980722 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980733 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980745 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980756 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980767 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980776 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980785 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980794 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980804 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980814 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980826 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980836 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980845 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980854 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.980864 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.981251 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.981261 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.981273 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: E1007 17:03:16.972619 4681 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.93:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186c444035481e81 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-07 17:03:16.957535873 +0000 UTC m=+0.604947508,LastTimestamp:2025-10-07 17:03:16.957535873 +0000 UTC m=+0.604947508,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.983984 4681 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984022 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984034 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984051 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984060 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984069 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984079 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984087 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984098 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984108 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984119 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984134 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984145 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984161 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984172 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984183 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984193 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984202 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984213 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984222 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984233 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984243 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984252 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984262 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984273 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984289 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984298 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984308 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984318 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984328 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984339 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984348 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984359 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984382 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984393 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984403 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984413 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984424 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984436 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984444 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984455 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984466 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984475 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984485 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984497 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984514 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984524 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984534 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984543 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984553 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984563 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984572 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984583 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984593 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984606 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984616 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984627 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984637 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984647 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984656 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984667 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984677 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984688 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984698 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984708 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984717 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984727 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984738 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984746 4681 reconstruct.go:97] "Volume reconstruction finished" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.984753 4681 reconciler.go:26] "Reconciler: start to sync state" Oct 07 17:03:16 crc kubenswrapper[4681]: I1007 17:03:16.995621 4681 manager.go:324] Recovery completed Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.003185 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.004700 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.004742 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.004750 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.005612 4681 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 
17:03:17.005630 4681 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.005651 4681 state_mem.go:36] "Initialized new in-memory state store" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.025157 4681 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.027106 4681 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.027284 4681 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.027378 4681 kubelet.go:2335] "Starting kubelet main sync loop" Oct 07 17:03:17 crc kubenswrapper[4681]: E1007 17:03:17.028007 4681 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 07 17:03:17 crc kubenswrapper[4681]: W1007 17:03:17.028764 4681 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.93:6443: connect: connection refused Oct 07 17:03:17 crc kubenswrapper[4681]: E1007 17:03:17.028831 4681 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.93:6443: connect: connection refused" logger="UnhandledError" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.029305 4681 policy_none.go:49] "None policy: Start" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.030148 4681 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.030170 4681 state_mem.go:35] "Initializing new in-memory state store" Oct 07 17:03:17 crc kubenswrapper[4681]: E1007 17:03:17.063910 4681 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.076137 4681 manager.go:334] "Starting Device Plugin manager" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.076200 4681 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.076275 4681 server.go:79] "Starting device plugin registration server" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.076659 4681 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.076669 4681 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.076928 4681 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.077011 4681 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.077024 4681 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 07 17:03:17 crc kubenswrapper[4681]: E1007 17:03:17.084278 4681 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
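Every upstream call in this window fails the same way: api-int.crc.testing resolves to 38.129.56.93, and port 6443 refuses the connection because the kube-apiserver is itself one of the static pods this kubelet has yet to start. The RuntimeClass reflector, the node-event writes, and the eviction manager's node lookup are all hitting that same gap, not separate faults. A minimal probe that reproduces the dial error from the log, assuming only the endpoint shown above (a standalone sketch, not kubelet code):

    // apiprobe.go - reproduce the "connect: connection refused" seen in the log
    // while the kube-apiserver static pod has not come up yet. The endpoint is
    // taken from the log lines above; nothing here is kubelet code.
    package main

    import (
        "fmt"
        "net"
        "time"
    )

    func main() {
        conn, err := net.DialTimeout("tcp", "api-int.crc.testing:6443", 2*time.Second)
        if err != nil {
            fmt.Println("apiserver not reachable yet:", err) // expect "connection refused"
            return
        }
        conn.Close()
        fmt.Println("apiserver TCP endpoint is up")
    }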
err="failed to get node info: node \"crc\" not found" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.128173 4681 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.128285 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.131828 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.131868 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.131900 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.132055 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.132350 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.132420 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.132925 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.132974 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.132984 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.133109 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.133219 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.133270 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.133293 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.133311 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.133321 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.133856 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.133896 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.133910 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.134087 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.134109 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.134120 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.134235 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.134459 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.134504 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.134751 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.134775 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.134784 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.134902 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.134953 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.134976 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.135941 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.135958 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.136090 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.136110 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.135970 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.136197 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.135985 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.136280 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.136293 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.136464 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.136499 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.137208 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.137231 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.137242 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:17 crc kubenswrapper[4681]: E1007 17:03:17.169232 4681 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.93:6443: connect: connection refused" interval="400ms" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.176990 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.177773 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.177801 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.177810 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.177828 4681 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 17:03:17 crc kubenswrapper[4681]: E1007 17:03:17.178243 4681 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.93:6443: connect: connection refused" node="crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.188343 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.188400 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.188424 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.188469 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.188486 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.188502 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.188520 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.188574 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.188609 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.188639 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.188655 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.188680 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.188695 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.188710 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.188730 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289374 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289426 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289443 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289460 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289477 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289493 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289509 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289523 4681 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289538 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289556 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289571 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289585 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289602 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289598 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289619 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289658 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289618 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289695 4681 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289695 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289695 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289726 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289702 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289745 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289754 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289759 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289746 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289729 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289792 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289726 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.289975 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.379369 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.380689 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.380732 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.380748 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.380782 4681 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 17:03:17 crc kubenswrapper[4681]: E1007 17:03:17.381275 4681 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.93:6443: connect: connection refused" node="crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.469773 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.493053 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.507572 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: W1007 17:03:17.510127 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-291a70f765d9f25f860716e9539705fc3f4440117be1059cc75540443805db4e WatchSource:0}: Error finding container 291a70f765d9f25f860716e9539705fc3f4440117be1059cc75540443805db4e: Status 404 returned error can't find the container with id 291a70f765d9f25f860716e9539705fc3f4440117be1059cc75540443805db4e Oct 07 17:03:17 crc kubenswrapper[4681]: W1007 17:03:17.524045 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-e5c79c22ff3bb157247e4956c0fc70401d503d8526bdbd78224d7675bea3c899 WatchSource:0}: Error finding container e5c79c22ff3bb157247e4956c0fc70401d503d8526bdbd78224d7675bea3c899: Status 404 returned error can't find the container with id e5c79c22ff3bb157247e4956c0fc70401d503d8526bdbd78224d7675bea3c899 Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.524255 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: W1007 17:03:17.525285 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-2f824dc83a2548320c8753dd34effd5856e2791291522fd5108d09ef5053b411 WatchSource:0}: Error finding container 2f824dc83a2548320c8753dd34effd5856e2791291522fd5108d09ef5053b411: Status 404 returned error can't find the container with id 2f824dc83a2548320c8753dd34effd5856e2791291522fd5108d09ef5053b411 Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.531914 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 17:03:17 crc kubenswrapper[4681]: W1007 17:03:17.536267 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-2c3a765b3f2e21c8b66514e559249aab67192336d9da358757f5a386a63c23d6 WatchSource:0}: Error finding container 2c3a765b3f2e21c8b66514e559249aab67192336d9da358757f5a386a63c23d6: Status 404 returned error can't find the container with id 2c3a765b3f2e21c8b66514e559249aab67192336d9da358757f5a386a63c23d6 Oct 07 17:03:17 crc kubenswrapper[4681]: W1007 17:03:17.550041 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-bc0b5358d1edf161a0871b8ac8b28e3bbc633a3593c3925aa58bb8e2a05936a8 WatchSource:0}: Error finding container bc0b5358d1edf161a0871b8ac8b28e3bbc633a3593c3925aa58bb8e2a05936a8: Status 404 returned error can't find the container with id bc0b5358d1edf161a0871b8ac8b28e3bbc633a3593c3925aa58bb8e2a05936a8 Oct 07 17:03:17 crc kubenswrapper[4681]: E1007 17:03:17.570943 4681 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.93:6443: connect: connection refused" interval="800ms" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.781809 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.783314 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.783350 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.783388 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.783412 4681 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 17:03:17 crc kubenswrapper[4681]: E1007 17:03:17.783805 4681 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.93:6443: connect: connection refused" node="crc" Oct 07 17:03:17 crc kubenswrapper[4681]: I1007 17:03:17.960796 4681 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.93:6443: connect: connection refused Oct 07 17:03:18 crc kubenswrapper[4681]: I1007 17:03:18.032599 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2c3a765b3f2e21c8b66514e559249aab67192336d9da358757f5a386a63c23d6"} Oct 07 17:03:18 crc kubenswrapper[4681]: I1007 17:03:18.034521 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2f824dc83a2548320c8753dd34effd5856e2791291522fd5108d09ef5053b411"} Oct 07 17:03:18 crc kubenswrapper[4681]: I1007 17:03:18.035550 
4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e5c79c22ff3bb157247e4956c0fc70401d503d8526bdbd78224d7675bea3c899"} Oct 07 17:03:18 crc kubenswrapper[4681]: I1007 17:03:18.036268 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"291a70f765d9f25f860716e9539705fc3f4440117be1059cc75540443805db4e"} Oct 07 17:03:18 crc kubenswrapper[4681]: I1007 17:03:18.036844 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bc0b5358d1edf161a0871b8ac8b28e3bbc633a3593c3925aa58bb8e2a05936a8"} Oct 07 17:03:18 crc kubenswrapper[4681]: W1007 17:03:18.115022 4681 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.93:6443: connect: connection refused Oct 07 17:03:18 crc kubenswrapper[4681]: E1007 17:03:18.115097 4681 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.93:6443: connect: connection refused" logger="UnhandledError" Oct 07 17:03:18 crc kubenswrapper[4681]: W1007 17:03:18.219440 4681 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.93:6443: connect: connection refused Oct 07 17:03:18 crc kubenswrapper[4681]: E1007 17:03:18.219575 4681 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.93:6443: connect: connection refused" logger="UnhandledError" Oct 07 17:03:18 crc kubenswrapper[4681]: E1007 17:03:18.371521 4681 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.93:6443: connect: connection refused" interval="1.6s" Oct 07 17:03:18 crc kubenswrapper[4681]: W1007 17:03:18.472980 4681 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.93:6443: connect: connection refused Oct 07 17:03:18 crc kubenswrapper[4681]: E1007 17:03:18.473086 4681 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.93:6443: connect: connection refused" logger="UnhandledError" Oct 07 17:03:18 crc kubenswrapper[4681]: W1007 17:03:18.540866 4681 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.93:6443: connect: connection refused Oct 07 17:03:18 crc kubenswrapper[4681]: E1007 17:03:18.541031 4681 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.93:6443: connect: connection refused" logger="UnhandledError" Oct 07 17:03:18 crc kubenswrapper[4681]: I1007 17:03:18.584726 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:18 crc kubenswrapper[4681]: I1007 17:03:18.586985 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:18 crc kubenswrapper[4681]: I1007 17:03:18.587052 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:18 crc kubenswrapper[4681]: I1007 17:03:18.587082 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:18 crc kubenswrapper[4681]: I1007 17:03:18.587126 4681 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 17:03:18 crc kubenswrapper[4681]: E1007 17:03:18.587784 4681 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.93:6443: connect: connection refused" node="crc" Oct 07 17:03:18 crc kubenswrapper[4681]: I1007 17:03:18.960211 4681 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.93:6443: connect: connection refused Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.041172 4681 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f" exitCode=0 Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.041249 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f"} Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.041336 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.042165 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.042195 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.042206 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.043388 4681 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="0751011a6e111dd3b3d09222e826afe9f712b02143a54130dfa00361cf6d3d98" exitCode=0 Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 
17:03:19.043442 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.043461 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"0751011a6e111dd3b3d09222e826afe9f712b02143a54130dfa00361cf6d3d98"} Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.044230 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.044263 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.044280 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.045087 4681 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="f98542ab0c413832a25dc4529fa4fd723dee659e13fc3bff39b944185d969bcb" exitCode=0 Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.045132 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"f98542ab0c413832a25dc4529fa4fd723dee659e13fc3bff39b944185d969bcb"} Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.045190 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.046073 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.046090 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.046097 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.048408 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc"} Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.048451 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4"} Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.048463 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957"} Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.048479 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b"} 
Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.048503 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.049133 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.049150 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.049158 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.049594 4681 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544" exitCode=0 Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.049616 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544"} Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.049676 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.050261 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.050280 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.050288 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.051232 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.052053 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.052082 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.052094 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:19 crc kubenswrapper[4681]: I1007 17:03:19.960391 4681 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.93:6443: connect: connection refused Oct 07 17:03:19 crc kubenswrapper[4681]: E1007 17:03:19.972209 4681 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.93:6443: connect: connection refused" interval="3.2s" Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.054289 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a882989a3f781f6b53f804e5087ed976d686ff6590e432f25a2e0537a70fd7d3"} Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.054352 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9"} Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.054368 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.054371 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e"} Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.054495 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6"} Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.054517 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5"} Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.055303 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.055342 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.055359 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.056126 4681 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3" exitCode=0 Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.056162 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3"} Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.056219 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.056957 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.056996 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.057011 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.058093 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.058089 4681 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f498c4ac9c2e2aa10188ede03e77421d502ec718a71af923d84081943350a914"} Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.058789 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.058812 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.058826 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.061681 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e99b069a866faa32130576f133ab0a61334f2e7f164cb87f204f032cc3c05391"} Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.061714 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"942434d645ee2a2ed25d4535eec28588e1988b53927e54c50c1c15d293fe6a4e"} Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.061727 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5d4eadece9eaef40838cea0c158dfd6208bf8392a02daab5c6e440143e2c9f41"} Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.061740 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.061740 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.062718 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.062740 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.062749 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.062945 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.062982 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.062992 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.188029 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.189203 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.189230 4681 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.189240 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:20 crc kubenswrapper[4681]: I1007 17:03:20.189265 4681 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 17:03:20 crc kubenswrapper[4681]: E1007 17:03:20.189670 4681 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.93:6443: connect: connection refused" node="crc" Oct 07 17:03:20 crc kubenswrapper[4681]: W1007 17:03:20.548539 4681 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.93:6443: connect: connection refused Oct 07 17:03:20 crc kubenswrapper[4681]: E1007 17:03:20.548682 4681 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.93:6443: connect: connection refused" logger="UnhandledError" Oct 07 17:03:21 crc kubenswrapper[4681]: I1007 17:03:21.066661 4681 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339" exitCode=0 Oct 07 17:03:21 crc kubenswrapper[4681]: I1007 17:03:21.066740 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339"} Oct 07 17:03:21 crc kubenswrapper[4681]: I1007 17:03:21.066832 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:21 crc kubenswrapper[4681]: I1007 17:03:21.066870 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 17:03:21 crc kubenswrapper[4681]: I1007 17:03:21.066832 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:21 crc kubenswrapper[4681]: I1007 17:03:21.066925 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:21 crc kubenswrapper[4681]: I1007 17:03:21.067660 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:21 crc kubenswrapper[4681]: I1007 17:03:21.067833 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 17:03:21 crc kubenswrapper[4681]: I1007 17:03:21.068305 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:21 crc kubenswrapper[4681]: I1007 17:03:21.068342 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:21 crc kubenswrapper[4681]: I1007 17:03:21.068349 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:21 crc kubenswrapper[4681]: I1007 
17:03:21.068364 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:21 crc kubenswrapper[4681]: I1007 17:03:21.068368 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:21 crc kubenswrapper[4681]: I1007 17:03:21.068416 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:21 crc kubenswrapper[4681]: I1007 17:03:21.068432 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:21 crc kubenswrapper[4681]: I1007 17:03:21.068441 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:21 crc kubenswrapper[4681]: I1007 17:03:21.068538 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:21 crc kubenswrapper[4681]: I1007 17:03:21.068553 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:21 crc kubenswrapper[4681]: I1007 17:03:21.068563 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:21 crc kubenswrapper[4681]: I1007 17:03:21.068375 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:22 crc kubenswrapper[4681]: I1007 17:03:22.077025 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:22 crc kubenswrapper[4681]: I1007 17:03:22.077486 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4a497be6ad9493d86bd47a7302e70eaa29c07906391a074fb587dc67744f7097"} Oct 07 17:03:22 crc kubenswrapper[4681]: I1007 17:03:22.077581 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c"} Oct 07 17:03:22 crc kubenswrapper[4681]: I1007 17:03:22.077607 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54"} Oct 07 17:03:22 crc kubenswrapper[4681]: I1007 17:03:22.077627 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4"} Oct 07 17:03:22 crc kubenswrapper[4681]: I1007 17:03:22.077638 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:22 crc kubenswrapper[4681]: I1007 17:03:22.077647 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e"} Oct 07 17:03:22 crc kubenswrapper[4681]: I1007 17:03:22.077688 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:22 crc kubenswrapper[4681]: I1007 17:03:22.078137 4681 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:22 crc kubenswrapper[4681]: I1007 17:03:22.078171 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:22 crc kubenswrapper[4681]: I1007 17:03:22.078186 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:22 crc kubenswrapper[4681]: I1007 17:03:22.078860 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:22 crc kubenswrapper[4681]: I1007 17:03:22.078920 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:22 crc kubenswrapper[4681]: I1007 17:03:22.078932 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:22 crc kubenswrapper[4681]: I1007 17:03:22.079181 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:22 crc kubenswrapper[4681]: I1007 17:03:22.079228 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:22 crc kubenswrapper[4681]: I1007 17:03:22.079246 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:22 crc kubenswrapper[4681]: I1007 17:03:22.620403 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 17:03:22 crc kubenswrapper[4681]: I1007 17:03:22.620681 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:22 crc kubenswrapper[4681]: I1007 17:03:22.622102 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:22 crc kubenswrapper[4681]: I1007 17:03:22.622160 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:22 crc kubenswrapper[4681]: I1007 17:03:22.622185 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:22 crc kubenswrapper[4681]: I1007 17:03:22.847238 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 17:03:23 crc kubenswrapper[4681]: I1007 17:03:23.079575 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:23 crc kubenswrapper[4681]: I1007 17:03:23.079657 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:23 crc kubenswrapper[4681]: I1007 17:03:23.081431 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:23 crc kubenswrapper[4681]: I1007 17:03:23.081509 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:23 crc kubenswrapper[4681]: I1007 17:03:23.081525 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:23 crc kubenswrapper[4681]: I1007 17:03:23.082486 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 07 17:03:23 crc kubenswrapper[4681]: I1007 17:03:23.082522 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:23 crc kubenswrapper[4681]: I1007 17:03:23.082543 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:23 crc kubenswrapper[4681]: I1007 17:03:23.390255 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:23 crc kubenswrapper[4681]: I1007 17:03:23.392445 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:23 crc kubenswrapper[4681]: I1007 17:03:23.392506 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:23 crc kubenswrapper[4681]: I1007 17:03:23.392524 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:23 crc kubenswrapper[4681]: I1007 17:03:23.392562 4681 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 17:03:23 crc kubenswrapper[4681]: I1007 17:03:23.619165 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 17:03:24 crc kubenswrapper[4681]: I1007 17:03:24.083070 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:24 crc kubenswrapper[4681]: I1007 17:03:24.084417 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:24 crc kubenswrapper[4681]: I1007 17:03:24.084480 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:24 crc kubenswrapper[4681]: I1007 17:03:24.084503 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:24 crc kubenswrapper[4681]: I1007 17:03:24.849706 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 07 17:03:24 crc kubenswrapper[4681]: I1007 17:03:24.849928 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:24 crc kubenswrapper[4681]: I1007 17:03:24.851043 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:24 crc kubenswrapper[4681]: I1007 17:03:24.851086 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:24 crc kubenswrapper[4681]: I1007 17:03:24.851095 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:27 crc kubenswrapper[4681]: E1007 17:03:27.084691 4681 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 07 17:03:27 crc kubenswrapper[4681]: I1007 17:03:27.822018 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 17:03:27 crc kubenswrapper[4681]: I1007 17:03:27.822396 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:27 crc kubenswrapper[4681]: I1007 17:03:27.824207 4681 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:27 crc kubenswrapper[4681]: I1007 17:03:27.824284 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:27 crc kubenswrapper[4681]: I1007 17:03:27.824325 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:27 crc kubenswrapper[4681]: I1007 17:03:27.824711 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 17:03:28 crc kubenswrapper[4681]: I1007 17:03:28.091379 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:28 crc kubenswrapper[4681]: I1007 17:03:28.092784 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:28 crc kubenswrapper[4681]: I1007 17:03:28.092815 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:28 crc kubenswrapper[4681]: I1007 17:03:28.092827 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:28 crc kubenswrapper[4681]: I1007 17:03:28.992436 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 17:03:29 crc kubenswrapper[4681]: I1007 17:03:29.000474 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 17:03:29 crc kubenswrapper[4681]: I1007 17:03:29.093970 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:29 crc kubenswrapper[4681]: I1007 17:03:29.095771 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:29 crc kubenswrapper[4681]: I1007 17:03:29.095818 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:29 crc kubenswrapper[4681]: I1007 17:03:29.095837 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:29 crc kubenswrapper[4681]: I1007 17:03:29.101099 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 17:03:30 crc kubenswrapper[4681]: I1007 17:03:30.096202 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:30 crc kubenswrapper[4681]: I1007 17:03:30.098184 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:30 crc kubenswrapper[4681]: I1007 17:03:30.098273 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:30 crc kubenswrapper[4681]: I1007 17:03:30.098325 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:30 crc kubenswrapper[4681]: I1007 17:03:30.822786 4681 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure 
output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 07 17:03:30 crc kubenswrapper[4681]: I1007 17:03:30.822850 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 07 17:03:30 crc kubenswrapper[4681]: W1007 17:03:30.918044 4681 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 07 17:03:30 crc kubenswrapper[4681]: I1007 17:03:30.918183 4681 trace.go:236] Trace[1548668420]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 17:03:20.916) (total time: 10001ms): Oct 07 17:03:30 crc kubenswrapper[4681]: Trace[1548668420]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (17:03:30.918) Oct 07 17:03:30 crc kubenswrapper[4681]: Trace[1548668420]: [10.001770748s] [10.001770748s] END Oct 07 17:03:30 crc kubenswrapper[4681]: E1007 17:03:30.918239 4681 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 07 17:03:30 crc kubenswrapper[4681]: I1007 17:03:30.960562 4681 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 07 17:03:30 crc kubenswrapper[4681]: I1007 17:03:30.978661 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 07 17:03:30 crc kubenswrapper[4681]: I1007 17:03:30.978810 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:30 crc kubenswrapper[4681]: I1007 17:03:30.979937 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:30 crc kubenswrapper[4681]: I1007 17:03:30.980004 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:30 crc kubenswrapper[4681]: I1007 17:03:30.980024 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:31 crc kubenswrapper[4681]: W1007 17:03:31.090348 4681 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 07 17:03:31 crc kubenswrapper[4681]: I1007 17:03:31.090502 4681 trace.go:236] Trace[2092563211]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 17:03:21.088) (total time: 10001ms): Oct 07 17:03:31 crc kubenswrapper[4681]: 
Trace[2092563211]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (17:03:31.090) Oct 07 17:03:31 crc kubenswrapper[4681]: Trace[2092563211]: [10.001934902s] [10.001934902s] END Oct 07 17:03:31 crc kubenswrapper[4681]: E1007 17:03:31.090535 4681 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 07 17:03:31 crc kubenswrapper[4681]: I1007 17:03:31.098548 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:31 crc kubenswrapper[4681]: I1007 17:03:31.099454 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:31 crc kubenswrapper[4681]: I1007 17:03:31.099508 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:31 crc kubenswrapper[4681]: I1007 17:03:31.099525 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:31 crc kubenswrapper[4681]: W1007 17:03:31.190645 4681 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 07 17:03:31 crc kubenswrapper[4681]: I1007 17:03:31.191100 4681 trace.go:236] Trace[1506511518]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 17:03:21.189) (total time: 10001ms): Oct 07 17:03:31 crc kubenswrapper[4681]: Trace[1506511518]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (17:03:31.190) Oct 07 17:03:31 crc kubenswrapper[4681]: Trace[1506511518]: [10.001594493s] [10.001594493s] END Oct 07 17:03:31 crc kubenswrapper[4681]: E1007 17:03:31.191297 4681 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 07 17:03:31 crc kubenswrapper[4681]: I1007 17:03:31.293651 4681 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 07 17:03:31 crc kubenswrapper[4681]: I1007 17:03:31.293716 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 07 17:03:31 crc kubenswrapper[4681]: I1007 17:03:31.304042 4681 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver 
namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 07 17:03:31 crc kubenswrapper[4681]: I1007 17:03:31.304116 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 07 17:03:32 crc kubenswrapper[4681]: I1007 17:03:32.103189 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 07 17:03:32 crc kubenswrapper[4681]: I1007 17:03:32.104910 4681 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a882989a3f781f6b53f804e5087ed976d686ff6590e432f25a2e0537a70fd7d3" exitCode=255 Oct 07 17:03:32 crc kubenswrapper[4681]: I1007 17:03:32.104964 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a882989a3f781f6b53f804e5087ed976d686ff6590e432f25a2e0537a70fd7d3"} Oct 07 17:03:32 crc kubenswrapper[4681]: I1007 17:03:32.105155 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:32 crc kubenswrapper[4681]: I1007 17:03:32.105938 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:32 crc kubenswrapper[4681]: I1007 17:03:32.105978 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:32 crc kubenswrapper[4681]: I1007 17:03:32.105991 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:32 crc kubenswrapper[4681]: I1007 17:03:32.106487 4681 scope.go:117] "RemoveContainer" containerID="a882989a3f781f6b53f804e5087ed976d686ff6590e432f25a2e0537a70fd7d3" Oct 07 17:03:32 crc kubenswrapper[4681]: I1007 17:03:32.527513 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 17:03:32 crc kubenswrapper[4681]: I1007 17:03:32.853006 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 17:03:33 crc kubenswrapper[4681]: I1007 17:03:33.111138 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 07 17:03:33 crc kubenswrapper[4681]: I1007 17:03:33.113262 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492"} Oct 07 17:03:33 crc kubenswrapper[4681]: I1007 17:03:33.113454 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:33 crc kubenswrapper[4681]: I1007 17:03:33.114666 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 07 17:03:33 crc kubenswrapper[4681]: I1007 17:03:33.114737 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:33 crc kubenswrapper[4681]: I1007 17:03:33.114760 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:33 crc kubenswrapper[4681]: I1007 17:03:33.122747 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 17:03:34 crc kubenswrapper[4681]: I1007 17:03:34.118435 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 07 17:03:34 crc kubenswrapper[4681]: I1007 17:03:34.120026 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 07 17:03:34 crc kubenswrapper[4681]: I1007 17:03:34.123070 4681 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492" exitCode=255 Oct 07 17:03:34 crc kubenswrapper[4681]: I1007 17:03:34.123133 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492"} Oct 07 17:03:34 crc kubenswrapper[4681]: I1007 17:03:34.123180 4681 scope.go:117] "RemoveContainer" containerID="a882989a3f781f6b53f804e5087ed976d686ff6590e432f25a2e0537a70fd7d3" Oct 07 17:03:34 crc kubenswrapper[4681]: I1007 17:03:34.123252 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:34 crc kubenswrapper[4681]: I1007 17:03:34.126836 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:34 crc kubenswrapper[4681]: I1007 17:03:34.126957 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:34 crc kubenswrapper[4681]: I1007 17:03:34.126987 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:34 crc kubenswrapper[4681]: I1007 17:03:34.129448 4681 scope.go:117] "RemoveContainer" containerID="210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492" Oct 07 17:03:34 crc kubenswrapper[4681]: E1007 17:03:34.129991 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 07 17:03:35 crc kubenswrapper[4681]: I1007 17:03:35.127247 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 07 17:03:35 crc kubenswrapper[4681]: I1007 17:03:35.128868 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:35 crc 
Oct 07 17:03:35 crc kubenswrapper[4681]: I1007 17:03:35.129594 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:03:35 crc kubenswrapper[4681]: I1007 17:03:35.129777 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:03:35 crc kubenswrapper[4681]: I1007 17:03:35.129978 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:03:35 crc kubenswrapper[4681]: I1007 17:03:35.130970 4681 scope.go:117] "RemoveContainer" containerID="210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492"
Oct 07 17:03:35 crc kubenswrapper[4681]: E1007 17:03:35.131370 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Oct 07 17:03:36 crc kubenswrapper[4681]: I1007 17:03:36.130387 4681 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Oct 07 17:03:36 crc kubenswrapper[4681]: E1007 17:03:36.292277 4681 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Oct 07 17:03:36 crc kubenswrapper[4681]: I1007 17:03:36.299312 4681 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Oct 07 17:03:36 crc kubenswrapper[4681]: I1007 17:03:36.299475 4681 trace.go:236] Trace[1646836430]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 17:03:24.532) (total time: 11766ms):
Oct 07 17:03:36 crc kubenswrapper[4681]: Trace[1646836430]: ---"Objects listed" error: 11766ms (17:03:36.299)
Oct 07 17:03:36 crc kubenswrapper[4681]: Trace[1646836430]: [11.766897562s] [11.766897562s] END
Oct 07 17:03:36 crc kubenswrapper[4681]: I1007 17:03:36.299491 4681 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Oct 07 17:03:36 crc kubenswrapper[4681]: E1007 17:03:36.299786 4681 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Oct 07 17:03:36 crc kubenswrapper[4681]: I1007 17:03:36.609256 4681 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Oct 07 17:03:36 crc kubenswrapper[4681]: I1007 17:03:36.955023 4681 apiserver.go:52] "Watching apiserver"
Oct 07 17:03:36 crc kubenswrapper[4681]: I1007 17:03:36.958042 4681 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Oct 07 17:03:36 crc kubenswrapper[4681]: I1007 17:03:36.958324 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"]
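Every kubelet entry in this journal has the same two-part shape: a journald prefix (timestamp, host, unit[pid]) followed by a klog header (severity letter, MMDD date, wall-clock time, pid, file:line) and the message. A short parsing sketch for slicing streams like the volume-teardown flood below; the regular expression is fitted to the lines above and is an assumption, not a format guarantee:

```go
package main

import (
	"fmt"
	"regexp"
)

// journald prefix + klog header, as seen in this log:
// "Oct 07 17:03:36 crc kubenswrapper[4681]: E1007 17:03:36.292277 4681 controller.go:145] ..."
var line = regexp.MustCompile(
	`^(\w{3} \d{2} [\d:]{8}) (\S+) (\w+)\[(\d+)\]: ([IWEF])(\d{4}) ([\d:.]+) +(\d+) ([\w.]+:\d+)\] (.*)$`)

func main() {
	sample := `Oct 07 17:03:36 crc kubenswrapper[4681]: E1007 17:03:36.292277 4681 controller.go:145] "Failed to ensure lease exists, will retry" interval="6.4s"`
	m := line.FindStringSubmatch(sample)
	if m == nil {
		fmt.Println("no match")
		return
	}
	// m[1]=journald time, m[2]=host, m[3]=unit, m[5]=severity,
	// m[9]=source file:line, m[10]=structured message.
	fmt.Printf("time=%s host=%s unit=%s severity=%s source=%s msg=%s\n",
		m[1], m[2], m[3], m[5], m[9], m[10])
}
```

Filtering on m[5] (E for errors) or m[9] (e.g. reconciler_common.go) is enough to separate the handful of actionable errors here from the bulk reconciler output.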
Oct 07 17:03:36 crc kubenswrapper[4681]: I1007 17:03:36.958688 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 07 17:03:36 crc kubenswrapper[4681]: I1007 17:03:36.958725 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 17:03:36 crc kubenswrapper[4681]: I1007 17:03:36.958754 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 07 17:03:36 crc kubenswrapper[4681]: I1007 17:03:36.958786 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 17:03:36 crc kubenswrapper[4681]: I1007 17:03:36.959180 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 07 17:03:36 crc kubenswrapper[4681]: E1007 17:03:36.959202 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 17:03:36 crc kubenswrapper[4681]: I1007 17:03:36.959297 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 17:03:36 crc kubenswrapper[4681]: E1007 17:03:36.959348 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 17:03:36 crc kubenswrapper[4681]: E1007 17:03:36.959428 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:03:36 crc kubenswrapper[4681]: I1007 17:03:36.961928 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 07 17:03:36 crc kubenswrapper[4681]: I1007 17:03:36.962071 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 07 17:03:36 crc kubenswrapper[4681]: I1007 17:03:36.962077 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 07 17:03:36 crc kubenswrapper[4681]: I1007 17:03:36.962665 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 07 17:03:36 crc kubenswrapper[4681]: I1007 17:03:36.963213 4681 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 07 17:03:36 crc kubenswrapper[4681]: I1007 17:03:36.963717 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 07 17:03:36 crc kubenswrapper[4681]: I1007 17:03:36.965542 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 07 17:03:36 crc kubenswrapper[4681]: I1007 17:03:36.965543 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 07 17:03:36 crc kubenswrapper[4681]: I1007 17:03:36.967331 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 07 17:03:36 crc kubenswrapper[4681]: I1007 17:03:36.967813 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 07 17:03:36 crc kubenswrapper[4681]: I1007 17:03:36.985410 4681 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 07 17:03:36 crc kubenswrapper[4681]: I1007 17:03:36.998495 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.003239 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.003434 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.003545 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.003753 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.003851 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.003976 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.004075 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: 
\"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.004273 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.004370 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.004463 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.004552 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.004658 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.004752 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.004858 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.004966 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.005051 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.005120 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod 
\"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.005197 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.005259 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.005322 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.005382 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.005444 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.005507 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.005570 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.005635 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.005699 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.005761 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.005828 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.005908 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.005982 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.006048 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.006113 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.006179 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.006244 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.006310 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.006375 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.006439 4681 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.006505 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.006573 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.006637 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.006701 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.006766 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.006858 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.006967 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.007035 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.007100 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.007165 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.007232 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.007301 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.007370 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.007434 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.007499 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.007564 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.007629 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.007688 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.007764 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.007829 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008161 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008207 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008227 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008246 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008263 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008280 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008298 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008317 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008334 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008355 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008373 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008433 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008452 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008467 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008484 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008502 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008519 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008538 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008556 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" 
(UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008574 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008592 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008607 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008626 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008641 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008656 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008675 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008692 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008708 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008725 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008741 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008756 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008771 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008787 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008803 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008818 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008922 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008943 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008962 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008979 4681 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008994 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009013 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009030 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009045 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009062 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009081 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009097 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009116 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009132 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 07 17:03:37 crc kubenswrapper[4681]: 
I1007 17:03:37.009148 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009165 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009185 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009204 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009222 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009239 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009267 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009286 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009303 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009322 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 07 17:03:37 crc 
kubenswrapper[4681]: I1007 17:03:37.009339 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009355 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009370 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009386 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009403 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009421 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009437 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009454 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009471 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009486 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 
17:03:37.009503 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009519 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009544 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009562 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009579 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.003548 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009599 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009615 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.003912 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009632 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009650 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009667 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009684 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009699 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009715 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009737 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009755 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009774 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009791 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009809 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009827 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009844 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009863 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009898 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009915 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009933 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009951 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009966 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009983 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009999 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010016 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010036 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010060 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010085 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010109 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010133 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010150 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010168 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010186 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010204 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010220 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010237 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010252 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010272 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010289 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010305 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010320 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010336 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010352 4681 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010368 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010383 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010399 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010415 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010432 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010450 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010471 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010488 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010505 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010523 
4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010541 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010560 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010577 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010594 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010611 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010629 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010651 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010667 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010684 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:03:37 crc 
kubenswrapper[4681]: I1007 17:03:37.010703 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010721 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010738 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010758 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.016414 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.016459 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.016488 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.016510 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.016530 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 17:03:37 crc 
kubenswrapper[4681]: I1007 17:03:37.016549 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.016589 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.016609 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.016631 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.016650 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.016668 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.016685 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.016704 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.016723 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: 
\"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.016791 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.016804 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.040152 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.003964 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.004625 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.004773 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.004850 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.004928 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.005068 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.005471 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.005529 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.042325 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.005600 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.005728 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.005951 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.006076 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.006200 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.006527 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.006646 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.006858 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.007748 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008090 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008159 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008182 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008136 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008366 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008568 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008569 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008600 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008841 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.008970 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009047 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009190 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009412 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009430 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009480 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009583 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009658 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.009666 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010160 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.010464 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.016828 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.016870 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.017017 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.017030 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.020005 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.020186 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.020347 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.021084 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.021204 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.021261 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.021284 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.021477 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.021800 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.021932 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.022158 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.022286 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.022539 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.022527 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.022773 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.022995 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.023058 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). 
InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.023069 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.023077 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.023276 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.023428 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.023744 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.024241 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.024676 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.024861 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). 
InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.025300 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.025501 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.025581 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.025919 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.026284 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.026414 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:03:37.526390442 +0000 UTC m=+21.173802087 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.026587 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.026688 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.026851 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.026970 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.027236 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.027447 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.027602 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.027765 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.028397 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.028418 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.028554 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.028811 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.029173 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.029183 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.029239 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.029271 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.031010 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.031236 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.031245 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.031451 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.031469 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.031541 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.031654 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.042844 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.031747 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.031830 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.032201 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.032212 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.032445 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.036790 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.036972 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.037233 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.037585 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.037657 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.037729 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.039586 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.040247 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.040563 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.040750 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.040767 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.040903 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.041051 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.041098 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.041110 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.041120 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.041516 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.041546 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.041551 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.041602 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.041714 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.041747 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.042097 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.043040 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.043041 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.042417 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.042547 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.042595 4681 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.042816 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.042835 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.042891 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.043010 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.043195 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.043382 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.042347 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.043537 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.043610 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.043718 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.043792 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.044246 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-07 17:03:37.544219359 +0000 UTC m=+21.191630914 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.044434 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.044684 4681 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.044857 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.045184 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.045217 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.045564 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.045625 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 17:03:37.545607736 +0000 UTC m=+21.193019371 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.045721 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.046931 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.047083 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.047834 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.047852 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.047862 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.048002 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.048156 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.048187 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.048455 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.048553 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.048630 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.048795 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.048826 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.048854 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.049199 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.049444 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.049901 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.050972 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.051131 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.051197 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.054107 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.054636 4681 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.057624 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.057782 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.057928 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.058008 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.058160 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.058640 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.058670 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.058943 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.059229 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.059300 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.059932 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.062025 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.065105 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.068474 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.069121 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.069342 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.069592 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.070018 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.070252 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.070084 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.071113 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.071485 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.071940 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.073086 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.073280 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.073489 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.082227 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.083514 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.084101 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.086079 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.086112 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.086125 4681 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.086180 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.086180 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 17:03:37.586163408 +0000 UTC m=+21.233574963 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.090260 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.092293 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.092999 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.093816 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.094701 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.094741 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.094932 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.094992 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.095634 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.096130 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.096567 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.096590 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.096601 4681 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.096644 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 17:03:37.596627752 +0000 UTC m=+21.244039297 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.097172 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.098023 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.098841 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.100470 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.105157 4681 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.105720 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.117637 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.119804 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.119856 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.119977 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.119994 4681 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120004 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120016 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120025 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120034 4681 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120042 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120054 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120063 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120071 4681 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node 
\"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120080 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120093 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120102 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120111 4681 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120122 4681 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120134 4681 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120143 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120151 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120163 4681 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120171 4681 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120179 4681 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120188 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120198 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 07 
17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120207 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120215 4681 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120225 4681 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120236 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120244 4681 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120253 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120265 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120263 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120522 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120275 4681 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120551 4681 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120561 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120573 4681 
reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120581 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120589 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120599 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120617 4681 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120625 4681 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120633 4681 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120643 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120651 4681 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120659 4681 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120667 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120678 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120687 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120696 4681 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120704 4681 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120715 4681 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120725 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120734 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120745 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120754 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120763 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120771 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120782 4681 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120791 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120799 4681 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120808 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120819 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120828 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120836 4681 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120844 4681 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120855 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120862 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120871 4681 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120897 4681 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120905 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120914 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120923 4681 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120934 4681 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120941 4681 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120950 4681 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120958 4681 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120968 4681 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120977 4681 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120984 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.120994 4681 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121002 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121010 4681 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121017 4681 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121028 4681 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121036 4681 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121044 4681 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121052 4681 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121063 4681 reconciler_common.go:293] "Volume detached for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121071 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121079 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121088 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121098 4681 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121106 4681 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121115 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121126 4681 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121136 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121145 4681 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121153 4681 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121163 4681 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121171 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121179 4681 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121188 4681 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121198 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121206 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121214 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121224 4681 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121240 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121248 4681 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121256 4681 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121267 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121275 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121283 4681 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121291 4681 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121302 4681 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121310 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121318 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121326 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121337 4681 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121345 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121353 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121363 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121372 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121381 4681 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121389 4681 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121399 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121407 4681 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121415 4681 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121434 4681 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121446 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121454 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121462 4681 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121472 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121480 4681 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121489 4681 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121498 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121508 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121517 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121525 4681 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121533 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121544 4681 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121553 4681 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121561 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121570 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121580 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121588 4681 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121596 4681 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121606 4681 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121614 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121622 4681 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121631 4681 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121641 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121649 4681 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc 
kubenswrapper[4681]: I1007 17:03:37.121657 4681 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121665 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121677 4681 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121686 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121694 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121706 4681 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121714 4681 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121722 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121730 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121740 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121749 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121757 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121764 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" 
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121775 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121782 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121790 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121798 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121809 4681 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121817 4681 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121825 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121840 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121848 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121856 4681 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121864 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121887 4681 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121896 4681 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc 
kubenswrapper[4681]: I1007 17:03:37.121904 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121912 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121922 4681 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121931 4681 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121939 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121947 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121957 4681 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121965 4681 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121973 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121983 4681 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121991 4681 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.121999 4681 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.122007 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: 
I1007 17:03:37.122017 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.122024 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.126742 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.135196 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.138140 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.135267 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.143547 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.144532 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.145984 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.148984 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.152358 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.153508 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.154166 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.154437 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.155408 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.156630 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.157360 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.158289 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.158850 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.159708 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.160426 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.161242 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.161697 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.162526 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.163419 4681 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.163975 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.164359 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.164785 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.172263 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.185923 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.197229 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.210811 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.221347 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.222583 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.269565 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.275795 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 17:03:37 crc kubenswrapper[4681]: W1007 17:03:37.284159 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-5bee273e2035b00fecce34ee3d0c140ba66154a1bfdfe8a2c0550b43c758a4be WatchSource:0}: Error finding container 5bee273e2035b00fecce34ee3d0c140ba66154a1bfdfe8a2c0550b43c758a4be: Status 404 returned error can't find the container with id 5bee273e2035b00fecce34ee3d0c140ba66154a1bfdfe8a2c0550b43c758a4be Oct 07 17:03:37 crc kubenswrapper[4681]: W1007 17:03:37.286541 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-2584202b6876d7a2e47c3e581a76963aadd44f75b55411c6d758c1315344b730 WatchSource:0}: Error finding container 2584202b6876d7a2e47c3e581a76963aadd44f75b55411c6d758c1315344b730: Status 404 returned error can't find the container with id 2584202b6876d7a2e47c3e581a76963aadd44f75b55411c6d758c1315344b730 Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.288566 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 17:03:37 crc kubenswrapper[4681]: W1007 17:03:37.300218 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-fc05d440a8044303a2494f8479461f33f372b33cfde4dfdc5069c10b941513dc WatchSource:0}: Error finding container fc05d440a8044303a2494f8479461f33f372b33cfde4dfdc5069c10b941513dc: Status 404 returned error can't find the container with id fc05d440a8044303a2494f8479461f33f372b33cfde4dfdc5069c10b941513dc Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.580616 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-8z5w6"] Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.580976 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-bt6z6"] Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.581166 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-4rn7z"] Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.581271 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.581289 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.582233 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-d6lkl"] Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.582410 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.583040 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.585606 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.586135 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-gm45r"] Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.586459 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-gm45r" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.589435 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.590739 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.590948 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.590987 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.592576 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.592888 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.592993 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.593120 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.594268 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.595099 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.595674 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.595749 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.595940 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.595953 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.596073 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.596990 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.597177 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.597357 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.597779 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 07 17:03:37 crc 
kubenswrapper[4681]: I1007 17:03:37.598010 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.598152 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.605242 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.626410 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.626469 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.626493 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.626514 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.626530 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.626575 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:03:38.626549991 +0000 UTC m=+22.273961546 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.626621 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.626635 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.626644 4681 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.626642 4681 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.626681 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 17:03:38.626668524 +0000 UTC m=+22.274080079 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.626708 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 17:03:38.626689525 +0000 UTC m=+22.274101080 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.626705 4681 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.626821 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-07 17:03:38.626800388 +0000 UTC m=+22.274211953 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.626973 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.626989 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.627002 4681 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:03:37 crc kubenswrapper[4681]: E1007 17:03:37.627032 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 17:03:38.627022443 +0000 UTC m=+22.274434078 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.629336 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.638586 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.653758 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.671796 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.680895 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.690364 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.698626 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.707954 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.718068 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.727793 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bcb2afcd-00d7-404d-9142-15c9fa365d2e-system-cni-dir\") pod \"multus-additional-cni-plugins-4rn7z\" (UID: \"bcb2afcd-00d7-404d-9142-15c9fa365d2e\") " pod="openshift-multus/multus-additional-cni-plugins-4rn7z" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.727848 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bcb2afcd-00d7-404d-9142-15c9fa365d2e-os-release\") pod \"multus-additional-cni-plugins-4rn7z\" (UID: \"bcb2afcd-00d7-404d-9142-15c9fa365d2e\") " pod="openshift-multus/multus-additional-cni-plugins-4rn7z" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.727865 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-multus-conf-dir\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.727906 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-kubelet\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.727924 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/615b8d72-0ec5-42d0-966e-db1c2b787962-ovnkube-config\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.727941 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0888bed1-620e-4a75-bcf8-460b4cd280ea-mcd-auth-proxy-config\") pod \"machine-config-daemon-8z5w6\" (UID: \"0888bed1-620e-4a75-bcf8-460b4cd280ea\") " pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.727955 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-os-release\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.727971 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-slash\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.727987 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-run-netns\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728002 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-node-log\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728019 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-cni-bin\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728042 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-jlzh6\" (UniqueName: \"kubernetes.io/projected/bcb2afcd-00d7-404d-9142-15c9fa365d2e-kube-api-access-jlzh6\") pod \"multus-additional-cni-plugins-4rn7z\" (UID: \"bcb2afcd-00d7-404d-9142-15c9fa365d2e\") " pod="openshift-multus/multus-additional-cni-plugins-4rn7z" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728058 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-multus-cni-dir\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728072 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-etc-openvswitch\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728088 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-run-ovn\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728103 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0888bed1-620e-4a75-bcf8-460b4cd280ea-rootfs\") pod \"machine-config-daemon-8z5w6\" (UID: \"0888bed1-620e-4a75-bcf8-460b4cd280ea\") " pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728127 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-host-run-netns\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728145 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-host-var-lib-cni-multus\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728160 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-host-run-multus-certs\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728226 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-run-openvswitch\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728248 4681 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728275 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qph5x\" (UniqueName: \"kubernetes.io/projected/0888bed1-620e-4a75-bcf8-460b4cd280ea-kube-api-access-qph5x\") pod \"machine-config-daemon-8z5w6\" (UID: \"0888bed1-620e-4a75-bcf8-460b4cd280ea\") " pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728299 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5d3235e5-a1c4-43c7-ab08-91ac8017289c-hosts-file\") pod \"node-resolver-gm45r\" (UID: \"5d3235e5-a1c4-43c7-ab08-91ac8017289c\") " pod="openshift-dns/node-resolver-gm45r" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728316 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-log-socket\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728332 4681 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bcb2afcd-00d7-404d-9142-15c9fa365d2e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4rn7z\" (UID: \"bcb2afcd-00d7-404d-9142-15c9fa365d2e\") " pod="openshift-multus/multus-additional-cni-plugins-4rn7z" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728370 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-host-var-lib-kubelet\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728390 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwvdw\" (UniqueName: \"kubernetes.io/projected/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-kube-api-access-dwvdw\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728412 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/615b8d72-0ec5-42d0-966e-db1c2b787962-ovn-node-metrics-cert\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728429 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-cnibin\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728446 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-systemd-units\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728461 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-cni-netd\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728476 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/615b8d72-0ec5-42d0-966e-db1c2b787962-env-overrides\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728505 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddm75\" (UniqueName: \"kubernetes.io/projected/5d3235e5-a1c4-43c7-ab08-91ac8017289c-kube-api-access-ddm75\") pod \"node-resolver-gm45r\" (UID: \"5d3235e5-a1c4-43c7-ab08-91ac8017289c\") " pod="openshift-dns/node-resolver-gm45r" Oct 07 17:03:37 crc 
kubenswrapper[4681]: I1007 17:03:37.728525 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-host-var-lib-cni-bin\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728539 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-multus-daemon-config\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728553 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-var-lib-openvswitch\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728569 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-run-ovn-kubernetes\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728585 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bcb2afcd-00d7-404d-9142-15c9fa365d2e-cnibin\") pod \"multus-additional-cni-plugins-4rn7z\" (UID: \"bcb2afcd-00d7-404d-9142-15c9fa365d2e\") " pod="openshift-multus/multus-additional-cni-plugins-4rn7z" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728621 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bcb2afcd-00d7-404d-9142-15c9fa365d2e-cni-binary-copy\") pod \"multus-additional-cni-plugins-4rn7z\" (UID: \"bcb2afcd-00d7-404d-9142-15c9fa365d2e\") " pod="openshift-multus/multus-additional-cni-plugins-4rn7z" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728658 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-etc-kubernetes\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728681 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728705 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-hostroot\") pod 
\"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728739 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bcb2afcd-00d7-404d-9142-15c9fa365d2e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4rn7z\" (UID: \"bcb2afcd-00d7-404d-9142-15c9fa365d2e\") " pod="openshift-multus/multus-additional-cni-plugins-4rn7z" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728752 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-multus-socket-dir-parent\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728768 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0888bed1-620e-4a75-bcf8-460b4cd280ea-proxy-tls\") pod \"machine-config-daemon-8z5w6\" (UID: \"0888bed1-620e-4a75-bcf8-460b4cd280ea\") " pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728784 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwz2j\" (UniqueName: \"kubernetes.io/projected/615b8d72-0ec5-42d0-966e-db1c2b787962-kube-api-access-lwz2j\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728801 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-system-cni-dir\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728818 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-cni-binary-copy\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728833 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-host-run-k8s-cni-cncf-io\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728848 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-run-systemd\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.728860 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/615b8d72-0ec5-42d0-966e-db1c2b787962-ovnkube-script-lib\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.737910 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.748008 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.762761 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.771901 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.783623 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.800530 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.808383 4681 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829052 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829245 4681 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-etc-openvswitch\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829273 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-run-ovn\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829297 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlzh6\" (UniqueName: \"kubernetes.io/projected/bcb2afcd-00d7-404d-9142-15c9fa365d2e-kube-api-access-jlzh6\") pod \"multus-additional-cni-plugins-4rn7z\" (UID: \"bcb2afcd-00d7-404d-9142-15c9fa365d2e\") " pod="openshift-multus/multus-additional-cni-plugins-4rn7z" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829312 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-multus-cni-dir\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829325 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-run-openvswitch\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829342 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0888bed1-620e-4a75-bcf8-460b4cd280ea-rootfs\") pod \"machine-config-daemon-8z5w6\" (UID: \"0888bed1-620e-4a75-bcf8-460b4cd280ea\") " pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829356 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-host-run-netns\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829371 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-host-var-lib-cni-multus\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829385 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-host-run-multus-certs\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829399 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-log-socket\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829415 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qph5x\" (UniqueName: \"kubernetes.io/projected/0888bed1-620e-4a75-bcf8-460b4cd280ea-kube-api-access-qph5x\") pod \"machine-config-daemon-8z5w6\" (UID: \"0888bed1-620e-4a75-bcf8-460b4cd280ea\") " pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829431 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5d3235e5-a1c4-43c7-ab08-91ac8017289c-hosts-file\") pod \"node-resolver-gm45r\" (UID: \"5d3235e5-a1c4-43c7-ab08-91ac8017289c\") " pod="openshift-dns/node-resolver-gm45r" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829446 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bcb2afcd-00d7-404d-9142-15c9fa365d2e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4rn7z\" (UID: \"bcb2afcd-00d7-404d-9142-15c9fa365d2e\") " pod="openshift-multus/multus-additional-cni-plugins-4rn7z" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829466 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-host-var-lib-kubelet\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829481 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwvdw\" (UniqueName: \"kubernetes.io/projected/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-kube-api-access-dwvdw\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829494 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/615b8d72-0ec5-42d0-966e-db1c2b787962-ovn-node-metrics-cert\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829508 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-cnibin\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829521 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-systemd-units\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829534 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-cni-netd\") pod 
\"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829547 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/615b8d72-0ec5-42d0-966e-db1c2b787962-env-overrides\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829591 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-var-lib-openvswitch\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829607 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-run-ovn-kubernetes\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829624 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddm75\" (UniqueName: \"kubernetes.io/projected/5d3235e5-a1c4-43c7-ab08-91ac8017289c-kube-api-access-ddm75\") pod \"node-resolver-gm45r\" (UID: \"5d3235e5-a1c4-43c7-ab08-91ac8017289c\") " pod="openshift-dns/node-resolver-gm45r" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829639 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-host-var-lib-cni-bin\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829653 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-multus-daemon-config\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829678 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-run-openvswitch\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829702 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-run-ovn\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829714 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-host-run-multus-certs\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" 
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829737 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-cni-netd\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829747 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bcb2afcd-00d7-404d-9142-15c9fa365d2e-cnibin\") pod \"multus-additional-cni-plugins-4rn7z\" (UID: \"bcb2afcd-00d7-404d-9142-15c9fa365d2e\") " pod="openshift-multus/multus-additional-cni-plugins-4rn7z"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829684 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-host-run-netns\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829718 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bcb2afcd-00d7-404d-9142-15c9fa365d2e-cnibin\") pod \"multus-additional-cni-plugins-4rn7z\" (UID: \"bcb2afcd-00d7-404d-9142-15c9fa365d2e\") " pod="openshift-multus/multus-additional-cni-plugins-4rn7z"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829812 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bcb2afcd-00d7-404d-9142-15c9fa365d2e-cni-binary-copy\") pod \"multus-additional-cni-plugins-4rn7z\" (UID: \"bcb2afcd-00d7-404d-9142-15c9fa365d2e\") " pod="openshift-multus/multus-additional-cni-plugins-4rn7z"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829839 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-etc-kubernetes\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829853 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-multus-cni-dir\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829870 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829907 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bcb2afcd-00d7-404d-9142-15c9fa365d2e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4rn7z\" (UID: \"bcb2afcd-00d7-404d-9142-15c9fa365d2e\") " pod="openshift-multus/multus-additional-cni-plugins-4rn7z"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829929 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-multus-socket-dir-parent\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829947 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-host-var-lib-kubelet\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829948 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-hostroot\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.830027 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0888bed1-620e-4a75-bcf8-460b4cd280ea-proxy-tls\") pod \"machine-config-daemon-8z5w6\" (UID: \"0888bed1-620e-4a75-bcf8-460b4cd280ea\") " pod="openshift-machine-config-operator/machine-config-daemon-8z5w6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.830050 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-run-systemd\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.830073 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/615b8d72-0ec5-42d0-966e-db1c2b787962-ovnkube-script-lib\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.830084 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-host-var-lib-cni-bin\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.830098 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwz2j\" (UniqueName: \"kubernetes.io/projected/615b8d72-0ec5-42d0-966e-db1c2b787962-kube-api-access-lwz2j\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.830118 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-var-lib-openvswitch\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.830123 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-system-cni-dir\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.830145 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-run-ovn-kubernetes\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.830163 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-cni-binary-copy\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.830194 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-host-run-k8s-cni-cncf-io\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.830228 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bcb2afcd-00d7-404d-9142-15c9fa365d2e-os-release\") pod \"multus-additional-cni-plugins-4rn7z\" (UID: \"bcb2afcd-00d7-404d-9142-15c9fa365d2e\") " pod="openshift-multus/multus-additional-cni-plugins-4rn7z"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.830275 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bcb2afcd-00d7-404d-9142-15c9fa365d2e-system-cni-dir\") pod \"multus-additional-cni-plugins-4rn7z\" (UID: \"bcb2afcd-00d7-404d-9142-15c9fa365d2e\") " pod="openshift-multus/multus-additional-cni-plugins-4rn7z"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.830299 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-multus-conf-dir\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.830311 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bcb2afcd-00d7-404d-9142-15c9fa365d2e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4rn7z\" (UID: \"bcb2afcd-00d7-404d-9142-15c9fa365d2e\") " pod="openshift-multus/multus-additional-cni-plugins-4rn7z"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.830322 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-kubelet\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829635 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-etc-openvswitch\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.830353 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-node-log\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.830375 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-cni-bin\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.830411 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/615b8d72-0ec5-42d0-966e-db1c2b787962-ovnkube-config\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.830444 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/615b8d72-0ec5-42d0-966e-db1c2b787962-env-overrides\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.830502 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829976 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-hostroot\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829714 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0888bed1-620e-4a75-bcf8-460b4cd280ea-rootfs\") pod \"machine-config-daemon-8z5w6\" (UID: \"0888bed1-620e-4a75-bcf8-460b4cd280ea\") " pod="openshift-machine-config-operator/machine-config-daemon-8z5w6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.830562 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-etc-kubernetes\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.830596 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-run-systemd\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829692 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-host-var-lib-cni-multus\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.830638 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bcb2afcd-00d7-404d-9142-15c9fa365d2e-system-cni-dir\") pod \"multus-additional-cni-plugins-4rn7z\" (UID: \"bcb2afcd-00d7-404d-9142-15c9fa365d2e\") " pod="openshift-multus/multus-additional-cni-plugins-4rn7z"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.830642 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bcb2afcd-00d7-404d-9142-15c9fa365d2e-cni-binary-copy\") pod \"multus-additional-cni-plugins-4rn7z\" (UID: \"bcb2afcd-00d7-404d-9142-15c9fa365d2e\") " pod="openshift-multus/multus-additional-cni-plugins-4rn7z"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.830708 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-system-cni-dir\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.831112 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-multus-daemon-config\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.831172 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-host-run-k8s-cni-cncf-io\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829859 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-log-socket\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.831198 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bcb2afcd-00d7-404d-9142-15c9fa365d2e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4rn7z\" (UID: \"bcb2afcd-00d7-404d-9142-15c9fa365d2e\") " pod="openshift-multus/multus-additional-cni-plugins-4rn7z"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.831217 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-cnibin\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.831243 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-cni-binary-copy\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.831253 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-cni-bin\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.831221 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-kubelet\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.831255 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-multus-socket-dir-parent\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.831199 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-multus-conf-dir\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.831260 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-node-log\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.829663 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5d3235e5-a1c4-43c7-ab08-91ac8017289c-hosts-file\") pod \"node-resolver-gm45r\" (UID: \"5d3235e5-a1c4-43c7-ab08-91ac8017289c\") " pod="openshift-dns/node-resolver-gm45r"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.831293 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bcb2afcd-00d7-404d-9142-15c9fa365d2e-os-release\") pod \"multus-additional-cni-plugins-4rn7z\" (UID: \"bcb2afcd-00d7-404d-9142-15c9fa365d2e\") " pod="openshift-multus/multus-additional-cni-plugins-4rn7z"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.831304 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-systemd-units\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.830449 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0888bed1-620e-4a75-bcf8-460b4cd280ea-mcd-auth-proxy-config\") pod \"machine-config-daemon-8z5w6\" (UID: \"0888bed1-620e-4a75-bcf8-460b4cd280ea\") " pod="openshift-machine-config-operator/machine-config-daemon-8z5w6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.831375 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-os-release\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.831473 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-slash\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.831487 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-run-netns\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.831730 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/615b8d72-0ec5-42d0-966e-db1c2b787962-ovnkube-script-lib\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.831447 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-os-release\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.832039 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0888bed1-620e-4a75-bcf8-460b4cd280ea-mcd-auth-proxy-config\") pod \"machine-config-daemon-8z5w6\" (UID: \"0888bed1-620e-4a75-bcf8-460b4cd280ea\") " pod="openshift-machine-config-operator/machine-config-daemon-8z5w6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.832152 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-slash\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.832175 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-run-netns\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.832330 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/615b8d72-0ec5-42d0-966e-db1c2b787962-ovnkube-config\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.834400 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/615b8d72-0ec5-42d0-966e-db1c2b787962-ovn-node-metrics-cert\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.837286 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.838306 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0888bed1-620e-4a75-bcf8-460b4cd280ea-proxy-tls\") pod \"machine-config-daemon-8z5w6\" (UID: \"0888bed1-620e-4a75-bcf8-460b4cd280ea\") " pod="openshift-machine-config-operator/machine-config-daemon-8z5w6"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.849483 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.849594 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwz2j\" (UniqueName:
\"kubernetes.io/projected/615b8d72-0ec5-42d0-966e-db1c2b787962-kube-api-access-lwz2j\") pod \"ovnkube-node-d6lkl\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.852056 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwvdw\" (UniqueName: \"kubernetes.io/projected/78a1d2b3-3c0e-49f1-877c-db4f34d3154b-kube-api-access-dwvdw\") pod \"multus-bt6z6\" (UID: \"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\") " pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.854089 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qph5x\" (UniqueName: \"kubernetes.io/projected/0888bed1-620e-4a75-bcf8-460b4cd280ea-kube-api-access-qph5x\") pod \"machine-config-daemon-8z5w6\" (UID: \"0888bed1-620e-4a75-bcf8-460b4cd280ea\") " pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.856009 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlzh6\" (UniqueName: \"kubernetes.io/projected/bcb2afcd-00d7-404d-9142-15c9fa365d2e-kube-api-access-jlzh6\") pod \"multus-additional-cni-plugins-4rn7z\" (UID: \"bcb2afcd-00d7-404d-9142-15c9fa365d2e\") " pod="openshift-multus/multus-additional-cni-plugins-4rn7z" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.856420 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.857312 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.859583 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddm75\" (UniqueName: \"kubernetes.io/projected/5d3235e5-a1c4-43c7-ab08-91ac8017289c-kube-api-access-ddm75\") pod \"node-resolver-gm45r\" (UID: \"5d3235e5-a1c4-43c7-ab08-91ac8017289c\") " pod="openshift-dns/node-resolver-gm45r" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.883126 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.894021 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-bt6z6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.901958 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.908319 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.912351 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.915720 4681 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.923657 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.926399 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.929832 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-gm45r" Oct 07 17:03:37 crc kubenswrapper[4681]: W1007 17:03:37.937520 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0888bed1_620e_4a75_bcf8_460b4cd280ea.slice/crio-2143aec30eb8b0ec35710ec3f106eace73a8a5656ff50fd2c0ae5b4891bc39b9 WatchSource:0}: Error finding container 2143aec30eb8b0ec35710ec3f106eace73a8a5656ff50fd2c0ae5b4891bc39b9: Status 404 returned error can't find the container with id 2143aec30eb8b0ec35710ec3f106eace73a8a5656ff50fd2c0ae5b4891bc39b9 Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.940093 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:37 crc kubenswrapper[4681]: W1007 17:03:37.976982 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d3235e5_a1c4_43c7_ab08_91ac8017289c.slice/crio-954433b556f86404f5c51077d5b644c67eba8b828ff2f41c2eed049105415b86 WatchSource:0}: Error finding container 954433b556f86404f5c51077d5b644c67eba8b828ff2f41c2eed049105415b86: Status 404 returned error can't find the container with id 954433b556f86404f5c51077d5b644c67eba8b828ff2f41c2eed049105415b86 Oct 07 17:03:37 crc kubenswrapper[4681]: I1007 17:03:37.996149 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.013206 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.036472 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.053588 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.076468 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin 
routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 
127.0.0.1:9743: connect: connection refused" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.092443 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",
\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.100564 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.108439 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.116065 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.127661 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.140796 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.142234 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f"} Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.142285 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5bee273e2035b00fecce34ee3d0c140ba66154a1bfdfe8a2c0550b43c758a4be"} Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.146500 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerStarted","Data":"b720ea623dfc5e1a465a899aeb2994c7b62aeaa0357c29f74f906e2e42f9f10e"} Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.147415 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gm45r" event={"ID":"5d3235e5-a1c4-43c7-ab08-91ac8017289c","Type":"ContainerStarted","Data":"954433b556f86404f5c51077d5b644c67eba8b828ff2f41c2eed049105415b86"} Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.148671 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"fc05d440a8044303a2494f8479461f33f372b33cfde4dfdc5069c10b941513dc"} Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.150172 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.150616 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7"} Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.150646 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2"} Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.150654 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2584202b6876d7a2e47c3e581a76963aadd44f75b55411c6d758c1315344b730"} Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.157720 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" event={"ID":"bcb2afcd-00d7-404d-9142-15c9fa365d2e","Type":"ContainerStarted","Data":"bc97023320cfbec9c704bbbb42d23564aee856c876c6f68eea10b3d713136b24"} Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.160692 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.166797 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerStarted","Data":"2143aec30eb8b0ec35710ec3f106eace73a8a5656ff50fd2c0ae5b4891bc39b9"} Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.169034 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bt6z6" event={"ID":"78a1d2b3-3c0e-49f1-877c-db4f34d3154b","Type":"ContainerStarted","Data":"f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01"} Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.169062 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bt6z6" event={"ID":"78a1d2b3-3c0e-49f1-877c-db4f34d3154b","Type":"ContainerStarted","Data":"64a506398540e1f3fc08d9c1f2912dc4e2cc0d1c856086866fad4045dfb59693"} Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.176898 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 
07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.188864 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:38Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.203412 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:38Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.216542 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:38Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.229576 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:38Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.248410 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.258096 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:38Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.259370 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.259657 4681 scope.go:117] "RemoveContainer" containerID="210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492" Oct 07 17:03:38 crc kubenswrapper[4681]: E1007 
17:03:38.259902 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.270057 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:38Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.280024 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:38Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.305587 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:38Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.348311 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba
8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:38Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.385529 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:38Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.424434 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:38Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.466898 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:38Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.505174 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:38Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.546181 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:38Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.586441 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:38Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.644663 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 
17:03:38.644774 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:03:38 crc kubenswrapper[4681]: E1007 17:03:38.644806 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:03:40.644781741 +0000 UTC m=+24.292193296 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.644848 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.644921 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:03:38 crc kubenswrapper[4681]: I1007 17:03:38.644964 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:03:38 crc kubenswrapper[4681]: E1007 17:03:38.644926 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 17:03:38 crc kubenswrapper[4681]: E1007 17:03:38.645030 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 17:03:38 crc kubenswrapper[4681]: E1007 17:03:38.645059 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 17:03:38 crc kubenswrapper[4681]: E1007 17:03:38.645073 4681 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:03:38 crc kubenswrapper[4681]: E1007 17:03:38.645035 4681 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 17:03:38 crc kubenswrapper[4681]: E1007 17:03:38.645042 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 17:03:38 crc kubenswrapper[4681]: E1007 17:03:38.645205 4681 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:03:38 crc kubenswrapper[4681]: E1007 17:03:38.644982 4681 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 17:03:38 crc kubenswrapper[4681]: E1007 17:03:38.645129 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 17:03:40.645112229 +0000 UTC m=+24.292523784 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:03:38 crc kubenswrapper[4681]: E1007 17:03:38.645260 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 17:03:40.645246143 +0000 UTC m=+24.292657698 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 17:03:38 crc kubenswrapper[4681]: E1007 17:03:38.645271 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 17:03:40.645265843 +0000 UTC m=+24.292677398 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 07 17:03:38 crc kubenswrapper[4681]: E1007 17:03:38.645281 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 17:03:40.645276843 +0000 UTC m=+24.292688398 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.028345 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.028411 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.028450 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 17:03:39 crc kubenswrapper[4681]: E1007 17:03:39.028571 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 17:03:39 crc kubenswrapper[4681]: E1007 17:03:39.028836 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 17:03:39 crc kubenswrapper[4681]: E1007 17:03:39.028904 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.031869 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.032520 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.033211 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.033770 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.034347 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.034834 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes"
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.035466 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes"
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.036014 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes"
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.036627 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.037152 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes"
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.037689 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.038405 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.038956 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.039484 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.042423 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.042945 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.043529 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.044276 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.044788 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.045733 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.046320 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.173172 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gm45r" event={"ID":"5d3235e5-a1c4-43c7-ab08-91ac8017289c","Type":"ContainerStarted","Data":"a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe"}
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.174717 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308"}
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.176077 4681 generic.go:334] "Generic (PLEG): container finished" podID="bcb2afcd-00d7-404d-9142-15c9fa365d2e" containerID="9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a" exitCode=0
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.176135 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" event={"ID":"bcb2afcd-00d7-404d-9142-15c9fa365d2e","Type":"ContainerDied","Data":"9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a"}
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.178838 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerStarted","Data":"239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c"}
Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.178901 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerStarted","Data":"95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a"} Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.180354 4681 generic.go:334] "Generic (PLEG): container finished" podID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerID="b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0" exitCode=0 Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.180399 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerDied","Data":"b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0"} Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.180907 4681 scope.go:117] "RemoveContainer" containerID="210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492" Oct 07 17:03:39 crc kubenswrapper[4681]: E1007 17:03:39.181033 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.192767 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:39Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.205651 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:39Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 
17:03:39.231622 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:39Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.248941 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:39Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.265705 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:39Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.288761 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:39Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.299301 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:39Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.311533 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:39Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.329536 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:39Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.359975 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:39Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.371841 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:39Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.382806 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:39Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.393560 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:39Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.407251 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:39Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.419549 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:39Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.432138 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:39Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.451795 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba76701840
89f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:39Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.463507 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:39Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.473983 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:39Z is after 
2025-08-24T17:21:41Z" Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.485310 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:39Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.501690 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:39Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.513753 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:39Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.527633 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:39Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:39 crc 
kubenswrapper[4681]: I1007 17:03:39.545730 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:39Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.587708 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:39Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:39 crc kubenswrapper[4681]: I1007 17:03:39.632040 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:39Z 
is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.186018 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerStarted","Data":"ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0"} Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.186306 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerStarted","Data":"1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa"} Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.186316 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerStarted","Data":"7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a"} Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.186324 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerStarted","Data":"42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139"} Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.187567 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" event={"ID":"bcb2afcd-00d7-404d-9142-15c9fa365d2e","Type":"ContainerStarted","Data":"6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca"} Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.205234 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:40Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.217409 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:40Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.228750 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:40Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.244165 4681 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:40Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.255841 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:40Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.273820 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:40Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.288906 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:40Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.297258 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:40Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.313534 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:40Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.332564 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:40Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.344574 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:40Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.346150 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-nvfz9"] Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.346461 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-nvfz9" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.347870 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.349118 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.349372 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.349842 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.355506 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:40Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.364177 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:40Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.376949 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\
":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:40Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.386619 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:40Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.395952 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:40Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.405892 4681 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:40Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.425532 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:40Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.463296 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbz4z\" (UniqueName: \"kubernetes.io/projected/a1337a96-93a5-4711-bf76-6e722a4cfd6f-kube-api-access-jbz4z\") pod \"node-ca-nvfz9\" (UID: \"a1337a96-93a5-4711-bf76-6e722a4cfd6f\") " pod="openshift-image-registry/node-ca-nvfz9" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.463343 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a1337a96-93a5-4711-bf76-6e722a4cfd6f-serviceca\") pod \"node-ca-nvfz9\" (UID: \"a1337a96-93a5-4711-bf76-6e722a4cfd6f\") " pod="openshift-image-registry/node-ca-nvfz9" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.463392 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1337a96-93a5-4711-bf76-6e722a4cfd6f-host\") pod \"node-ca-nvfz9\" (UID: \"a1337a96-93a5-4711-bf76-6e722a4cfd6f\") " pod="openshift-image-registry/node-ca-nvfz9" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.470552 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:40Z 
is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.505365 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:40Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.549033 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:40Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.564580 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbz4z\" (UniqueName: \"kubernetes.io/projected/a1337a96-93a5-4711-bf76-6e722a4cfd6f-kube-api-access-jbz4z\") pod \"node-ca-nvfz9\" (UID: \"a1337a96-93a5-4711-bf76-6e722a4cfd6f\") " pod="openshift-image-registry/node-ca-nvfz9" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.564626 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a1337a96-93a5-4711-bf76-6e722a4cfd6f-serviceca\") pod \"node-ca-nvfz9\" (UID: \"a1337a96-93a5-4711-bf76-6e722a4cfd6f\") " pod="openshift-image-registry/node-ca-nvfz9" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.564652 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1337a96-93a5-4711-bf76-6e722a4cfd6f-host\") pod \"node-ca-nvfz9\" (UID: 
\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\") " pod="openshift-image-registry/node-ca-nvfz9" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.564703 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1337a96-93a5-4711-bf76-6e722a4cfd6f-host\") pod \"node-ca-nvfz9\" (UID: \"a1337a96-93a5-4711-bf76-6e722a4cfd6f\") " pod="openshift-image-registry/node-ca-nvfz9" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.565561 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a1337a96-93a5-4711-bf76-6e722a4cfd6f-serviceca\") pod \"node-ca-nvfz9\" (UID: \"a1337a96-93a5-4711-bf76-6e722a4cfd6f\") " pod="openshift-image-registry/node-ca-nvfz9" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.583961 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:40Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.620405 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbz4z\" (UniqueName: \"kubernetes.io/projected/a1337a96-93a5-4711-bf76-6e722a4cfd6f-kube-api-access-jbz4z\") pod \"node-ca-nvfz9\" (UID: \"a1337a96-93a5-4711-bf76-6e722a4cfd6f\") " pod="openshift-image-registry/node-ca-nvfz9" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.646986 4681 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-07T17:03:40Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.656403 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nvfz9" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.666008 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.666103 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.666130 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.666154 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.666175 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:03:40 crc kubenswrapper[4681]: E1007 17:03:40.666263 4681 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 17:03:40 crc kubenswrapper[4681]: E1007 17:03:40.666310 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 17:03:44.666296646 +0000 UTC m=+28.313708201 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 17:03:40 crc kubenswrapper[4681]: E1007 17:03:40.666611 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:03:44.666603215 +0000 UTC m=+28.314014770 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:03:40 crc kubenswrapper[4681]: E1007 17:03:40.666647 4681 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 17:03:40 crc kubenswrapper[4681]: E1007 17:03:40.666668 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 17:03:44.666662647 +0000 UTC m=+28.314074202 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 17:03:40 crc kubenswrapper[4681]: E1007 17:03:40.666715 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 17:03:40 crc kubenswrapper[4681]: E1007 17:03:40.666726 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 17:03:40 crc kubenswrapper[4681]: E1007 17:03:40.666737 4681 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:03:40 crc kubenswrapper[4681]: E1007 17:03:40.666756 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 17:03:44.666750429 +0000 UTC m=+28.314161984 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:03:40 crc kubenswrapper[4681]: E1007 17:03:40.666796 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 17:03:40 crc kubenswrapper[4681]: E1007 17:03:40.666805 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 17:03:40 crc kubenswrapper[4681]: E1007 17:03:40.666812 4681 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:03:40 crc kubenswrapper[4681]: E1007 17:03:40.666828 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 17:03:44.666823341 +0000 UTC m=+28.314234886 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.685143 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:40Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.731302 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:40Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.766023 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:40Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.804353 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:40Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:40 crc kubenswrapper[4681]: I1007 17:03:40.999754 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.009645 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.012335 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.020162 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z 
is after 2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.028122 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.028207 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:03:41 crc kubenswrapper[4681]: E1007 17:03:41.028248 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:03:41 crc kubenswrapper[4681]: E1007 17:03:41.028333 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.028487 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:03:41 crc kubenswrapper[4681]: E1007 17:03:41.028566 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.033966 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.047655 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.057780 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.068343 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.079640 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.104524 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.145326 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.185155 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.194608 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nvfz9" event={"ID":"a1337a96-93a5-4711-bf76-6e722a4cfd6f","Type":"ContainerStarted","Data":"3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0"} Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.194662 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nvfz9" event={"ID":"a1337a96-93a5-4711-bf76-6e722a4cfd6f","Type":"ContainerStarted","Data":"b6a2a2a9272b05a95fa718b8cc78b96778d2aedd232ffacaefc35f706762d53d"} Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.197326 4681 generic.go:334] "Generic (PLEG): container finished" podID="bcb2afcd-00d7-404d-9142-15c9fa365d2e" containerID="6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca" exitCode=0 Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.197405 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" event={"ID":"bcb2afcd-00d7-404d-9142-15c9fa365d2e","Type":"ContainerDied","Data":"6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca"} Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.201528 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerStarted","Data":"fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0"} Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.201565 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerStarted","Data":"ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b"} Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.225894 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: E1007 17:03:41.243915 4681 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.282773 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.323197 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.366493 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.407793 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.445413 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.484359 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.523695 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.565813 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba76701840
89f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.606793 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.647419 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 
2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.687454 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.738086 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.779968 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.805342 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-
07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.842559 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.883648 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.929479 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c0790
6391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:41 crc kubenswrapper[4681]: I1007 17:03:41.965437 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:41Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.009112 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:42Z 
is after 2025-08-24T17:21:41Z" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.205028 4681 generic.go:334] "Generic (PLEG): container finished" podID="bcb2afcd-00d7-404d-9142-15c9fa365d2e" containerID="ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2" exitCode=0 Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.205183 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" event={"ID":"bcb2afcd-00d7-404d-9142-15c9fa365d2e","Type":"ContainerDied","Data":"ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2"} Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.246771 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:42Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.257862 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:42Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.269035 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:42Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.280193 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:42Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.293734 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:42Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.304362 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:42Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.319790 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:42Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.330805 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:42Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.366550 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:42Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.413658 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resourc
e-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c07906391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
25-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:42Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.446293 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:42Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.493586 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:42Z 
is after 2025-08-24T17:21:41Z" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.524351 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:42Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.527631 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.528381 4681 scope.go:117] "RemoveContainer" containerID="210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492" Oct 07 17:03:42 crc kubenswrapper[4681]: E1007 17:03:42.528550 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.564414 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:42Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.605351 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:42Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.700501 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.702122 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.702160 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.702171 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.702275 4681 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.707477 4681 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 
17:03:42.707690 4681 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.708489 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.708521 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.708531 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.708544 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.708554 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:42Z","lastTransitionTime":"2025-10-07T17:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:42 crc kubenswrapper[4681]: E1007 17:03:42.721812 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:42Z is after 
2025-08-24T17:21:41Z" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.724674 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.724714 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.724723 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.724737 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.724747 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:42Z","lastTransitionTime":"2025-10-07T17:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:42 crc kubenswrapper[4681]: E1007 17:03:42.737296 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:42Z is after 
2025-08-24T17:21:41Z" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.740210 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.740249 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.740258 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.740273 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.740282 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:42Z","lastTransitionTime":"2025-10-07T17:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:42 crc kubenswrapper[4681]: E1007 17:03:42.753123 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:42Z is after 
2025-08-24T17:21:41Z" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.756426 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.756457 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.756489 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.756510 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.756521 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:42Z","lastTransitionTime":"2025-10-07T17:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:42 crc kubenswrapper[4681]: E1007 17:03:42.770093 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:42Z is after 
2025-08-24T17:21:41Z" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.772785 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.772815 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.772822 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.772835 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.772845 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:42Z","lastTransitionTime":"2025-10-07T17:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:42 crc kubenswrapper[4681]: E1007 17:03:42.784033 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:42Z is after 
2025-08-24T17:21:41Z" Oct 07 17:03:42 crc kubenswrapper[4681]: E1007 17:03:42.784147 4681 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.787756 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.787797 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.787817 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.787839 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.787895 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:42Z","lastTransitionTime":"2025-10-07T17:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.890312 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.890358 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.890371 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.890387 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.890400 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:42Z","lastTransitionTime":"2025-10-07T17:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.992810 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.992870 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.992951 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.992977 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:42 crc kubenswrapper[4681]: I1007 17:03:42.993006 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:42Z","lastTransitionTime":"2025-10-07T17:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.028653 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.028735 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:03:43 crc kubenswrapper[4681]: E1007 17:03:43.028762 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.028816 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:03:43 crc kubenswrapper[4681]: E1007 17:03:43.028958 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:03:43 crc kubenswrapper[4681]: E1007 17:03:43.029048 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.095531 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.095585 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.095601 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.095636 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.095653 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:43Z","lastTransitionTime":"2025-10-07T17:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.198045 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.198083 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.198094 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.198111 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.198124 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:43Z","lastTransitionTime":"2025-10-07T17:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.209756 4681 generic.go:334] "Generic (PLEG): container finished" podID="bcb2afcd-00d7-404d-9142-15c9fa365d2e" containerID="e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de" exitCode=0 Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.209818 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" event={"ID":"bcb2afcd-00d7-404d-9142-15c9fa365d2e","Type":"ContainerDied","Data":"e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de"} Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.215626 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerStarted","Data":"8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353"} Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.231063 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:43Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.244766 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:43Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.259182 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:43Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.271202 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba76701840
89f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:43Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.284871 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:43Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.299282 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:43Z is after 
2025-08-24T17:21:41Z" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.300957 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.300995 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.301006 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.301023 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.301034 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:43Z","lastTransitionTime":"2025-10-07T17:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.312409 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:43Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.323736 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:43Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.336958 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:43Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.349946 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:43Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.358524 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\"
 for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:43Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.368902 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:43Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.388490 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c0790
6391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:43Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.398816 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:43Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.417074 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.417105 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.417116 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.417131 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.417141 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:43Z","lastTransitionTime":"2025-10-07T17:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.455279 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d
648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:43Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.518690 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.518718 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.518727 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.518739 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.518747 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:43Z","lastTransitionTime":"2025-10-07T17:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.620701 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.620747 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.620757 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.620771 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.620780 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:43Z","lastTransitionTime":"2025-10-07T17:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.722921 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.722959 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.722968 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.722981 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.722990 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:43Z","lastTransitionTime":"2025-10-07T17:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.825533 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.825586 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.825602 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.825626 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.825642 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:43Z","lastTransitionTime":"2025-10-07T17:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.928152 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.928192 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.928201 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.928216 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:03:43 crc kubenswrapper[4681]: I1007 17:03:43.928225 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:43Z","lastTransitionTime":"2025-10-07T17:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.030693 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.030956 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.030964 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.030976 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.030984 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:44Z","lastTransitionTime":"2025-10-07T17:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.139677 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.140088 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.140177 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.140266 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.140397 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:44Z","lastTransitionTime":"2025-10-07T17:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.222684 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" event={"ID":"bcb2afcd-00d7-404d-9142-15c9fa365d2e","Type":"ContainerStarted","Data":"92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d"}
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.242537 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.242570 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.242579 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.242599 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.242479 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:44Z is after 2025-08-24T17:21:41Z"
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.242609 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:44Z","lastTransitionTime":"2025-10-07T17:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.261553 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c07906391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:44Z is after 2025-08-24T17:21:41Z"
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.272074 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:44Z is after 2025-08-24T17:21:41Z"
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.285198 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:44Z is after 2025-08-24T17:21:41Z"
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.296150 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:44Z is after 2025-08-24T17:21:41Z"
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.307527 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:44Z is after 2025-08-24T17:21:41Z"
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.319134 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:44Z is after 2025-08-24T17:21:41Z"
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.331095 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:44Z is after 2025-08-24T17:21:41Z"
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.342650 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:44Z is after 2025-08-24T17:21:41Z"
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.344368 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.344400 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.344410 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.344425 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.344436 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:44Z","lastTransitionTime":"2025-10-07T17:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.356361 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:44Z is after 2025-08-24T17:21:41Z"
Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.368623 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03
:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:44Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.377560 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:44Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.386158 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:44Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.396586 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:44Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.409373 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:44Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.445976 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.446182 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.446298 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.446418 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.446487 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:44Z","lastTransitionTime":"2025-10-07T17:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.548944 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.548972 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.548982 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.548997 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.549007 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:44Z","lastTransitionTime":"2025-10-07T17:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.650945 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.650973 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.650980 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.650997 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.651005 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:44Z","lastTransitionTime":"2025-10-07T17:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.704967 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.705119 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:03:44 crc kubenswrapper[4681]: E1007 17:03:44.705184 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-07 17:03:52.705146913 +0000 UTC m=+36.352558508 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.705261 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:03:44 crc kubenswrapper[4681]: E1007 17:03:44.705295 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 17:03:44 crc kubenswrapper[4681]: E1007 17:03:44.705326 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 17:03:44 crc kubenswrapper[4681]: E1007 17:03:44.705347 4681 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.705360 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:03:44 crc kubenswrapper[4681]: E1007 17:03:44.705423 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 17:03:52.705399249 +0000 UTC m=+36.352810844 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.705457 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:03:44 crc kubenswrapper[4681]: E1007 17:03:44.705525 4681 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 17:03:44 crc kubenswrapper[4681]: E1007 17:03:44.705542 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 17:03:44 crc kubenswrapper[4681]: E1007 17:03:44.705581 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 17:03:44 crc kubenswrapper[4681]: E1007 17:03:44.705596 4681 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:03:44 crc kubenswrapper[4681]: E1007 17:03:44.705552 4681 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 17:03:44 crc kubenswrapper[4681]: E1007 17:03:44.705583 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 17:03:52.705568384 +0000 UTC m=+36.352979979 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 17:03:44 crc kubenswrapper[4681]: E1007 17:03:44.705676 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 17:03:52.705657956 +0000 UTC m=+36.353069571 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:03:44 crc kubenswrapper[4681]: E1007 17:03:44.705697 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 17:03:52.705683497 +0000 UTC m=+36.353095182 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.753786 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.753830 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.753843 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.753867 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.753913 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:44Z","lastTransitionTime":"2025-10-07T17:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.857510 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.857548 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.857558 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.857575 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.857585 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:44Z","lastTransitionTime":"2025-10-07T17:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.959488 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.959567 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.959587 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.960098 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:44 crc kubenswrapper[4681]: I1007 17:03:44.960321 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:44Z","lastTransitionTime":"2025-10-07T17:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.029227 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.029248 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.029366 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:03:45 crc kubenswrapper[4681]: E1007 17:03:45.029533 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:03:45 crc kubenswrapper[4681]: E1007 17:03:45.029636 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:03:45 crc kubenswrapper[4681]: E1007 17:03:45.029793 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.062735 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.062826 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.062859 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.062954 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.062982 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:45Z","lastTransitionTime":"2025-10-07T17:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.165261 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.165298 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.165310 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.165328 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.165339 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:45Z","lastTransitionTime":"2025-10-07T17:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.232365 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerStarted","Data":"c7932807ceacea311845174e6b64a4347ff0a80b857d30bf495027926190c993"} Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.232604 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.232617 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.234921 4681 generic.go:334] "Generic (PLEG): container finished" podID="bcb2afcd-00d7-404d-9142-15c9fa365d2e" containerID="92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d" exitCode=0 Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.234977 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" event={"ID":"bcb2afcd-00d7-404d-9142-15c9fa365d2e","Type":"ContainerDied","Data":"92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d"} Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.245164 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11
\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.260450 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.261529 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.266653 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.266823 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.266926 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.267005 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.267090 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:45Z","lastTransitionTime":"2025-10-07T17:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.272956 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.284459 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.295232 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\
",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.321180 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308
e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c07906391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.331867 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.347614 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7932807ceacea311845174e6b64a4347ff0a80b
857d30bf495027926190c993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.356731 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.366659 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.369020 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.369051 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.369062 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.369077 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.369089 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:45Z","lastTransitionTime":"2025-10-07T17:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.376296 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.386688 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.402259 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z"
Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.414228 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z"
Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.428435 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z"
Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.442711 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z"
Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.472094 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.472135 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.472145 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.472160 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.472171 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:45Z","lastTransitionTime":"2025-10-07T17:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.472962 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z"
Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.492620 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z"
Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.501909 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z"
Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.510349 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z"
Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.530286 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c07906391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z"
Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.544064 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z"
Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.562792 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7932807ceacea311845174e6b64a4347ff0a80b857d30bf495027926190c993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z"
Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.575311 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.575363 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.575376 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.575521 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.575538 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:45Z","lastTransitionTime":"2025-10-07T17:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.576591 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z"
Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.587360 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z"
Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.597824 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z"
Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.608786 4681 status_manager.go:875] "Failed to
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba76701840
89f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.620731 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.635225 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 
2025-08-24T17:21:41Z" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.650704 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:45Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.678506 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.678543 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.678552 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.678566 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.678575 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:45Z","lastTransitionTime":"2025-10-07T17:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.779982 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.780027 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.780038 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.780056 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.780068 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:45Z","lastTransitionTime":"2025-10-07T17:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.882743 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.882776 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.882784 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.882796 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.882805 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:45Z","lastTransitionTime":"2025-10-07T17:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.985478 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.985510 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.985525 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.985539 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:45 crc kubenswrapper[4681]: I1007 17:03:45.985550 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:45Z","lastTransitionTime":"2025-10-07T17:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.087359 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.087395 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.087406 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.087421 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.087434 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:46Z","lastTransitionTime":"2025-10-07T17:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.191347 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.191403 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.191439 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.191471 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.191496 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:46Z","lastTransitionTime":"2025-10-07T17:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.241256 4681 generic.go:334] "Generic (PLEG): container finished" podID="bcb2afcd-00d7-404d-9142-15c9fa365d2e" containerID="927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536" exitCode=0 Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.241294 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" event={"ID":"bcb2afcd-00d7-404d-9142-15c9fa365d2e","Type":"ContainerDied","Data":"927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536"} Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.242152 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.254908 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"
cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.270675 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.272220 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.285103 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.294288 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.294331 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.294340 4681 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.294353 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.294361 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:46Z","lastTransitionTime":"2025-10-07T17:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.296993 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.307494 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.319247 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.331737 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.340373 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.349994 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.368354 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c0790
6391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.381193 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.396815 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.396839 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.396847 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.396859 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.396870 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:46Z","lastTransitionTime":"2025-10-07T17:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.398240 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7932807ceacea311845174e6b64a4347ff0a80b857d30bf495027926190c993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkub
e-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.409415 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.419652 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.428825 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.438960 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba76701840
89f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.451071 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.462285 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 
2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.472184 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.484744 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.494775 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.498353 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.498378 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.498386 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.498400 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.498409 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:46Z","lastTransitionTime":"2025-10-07T17:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.507377 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da
50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\
\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.516839 4681 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.525767 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.542119 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c0790
6391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.551681 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.568666 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7932807ceacea311845174e6b64a4347ff0a80b
857d30bf495027926190c993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.581393 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.591417 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.601046 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.601081 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.601098 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.601119 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.601134 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:46Z","lastTransitionTime":"2025-10-07T17:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.602101 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:46Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.702641 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.702672 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.702681 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.702694 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.702703 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:46Z","lastTransitionTime":"2025-10-07T17:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.805564 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.805620 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.805636 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.805657 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.805672 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:46Z","lastTransitionTime":"2025-10-07T17:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.908490 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.908545 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.908563 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.908589 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:46 crc kubenswrapper[4681]: I1007 17:03:46.908606 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:46Z","lastTransitionTime":"2025-10-07T17:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.011405 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.011450 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.011459 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.011472 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.011482 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:47Z","lastTransitionTime":"2025-10-07T17:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.028107 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.028250 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:03:47 crc kubenswrapper[4681]: E1007 17:03:47.028446 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.028478 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:03:47 crc kubenswrapper[4681]: E1007 17:03:47.028621 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:03:47 crc kubenswrapper[4681]: E1007 17:03:47.029133 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.062169 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.086005 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.103819 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.113938 4681 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.113996 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.114012 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.114035 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.114051 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:47Z","lastTransitionTime":"2025-10-07T17:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.120312 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.136795 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.165481 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.182049 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.194908 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.207280 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.217300 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.217343 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.217353 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.217372 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.217383 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:47Z","lastTransitionTime":"2025-10-07T17:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.222343 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da
50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\
\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.239413 4681 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.249105 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" event={"ID":"bcb2afcd-00d7-404d-9142-15c9fa365d2e","Type":"ContainerStarted","Data":"4f774e29adda222a4662ace0017b8b7697481452841491133bf4839360ba9a8a"} Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.257790 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.282280 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c0790
6391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.295008 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.319474 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.319509 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.319521 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.319537 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.319549 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:47Z","lastTransitionTime":"2025-10-07T17:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.339861 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7932807ceacea311845174e6b64a4347ff0a80b857d30bf495027926190c993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\
"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.375294 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.393658 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.415751 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.429455 4681 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.429939 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.429958 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.429980 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.429990 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:47Z","lastTransitionTime":"2025-10-07T17:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.430724 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.447900 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.459863 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.472377 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.484058 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.495326 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.507447 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f774e29adda222a4662ace0017b8b7697481452841491133bf4839360ba9a8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.516817 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.526570 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.531662 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.531888 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.532031 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.532152 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.532237 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:47Z","lastTransitionTime":"2025-10-07T17:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.555102 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c07906391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.568686 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.587064 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7932807ceacea311845174e6b64a4347ff0a80b
857d30bf495027926190c993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.635090 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.635137 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.635162 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.635185 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.635201 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:47Z","lastTransitionTime":"2025-10-07T17:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.737536 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.737573 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.737581 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.737596 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.737607 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:47Z","lastTransitionTime":"2025-10-07T17:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.840180 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.840232 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.840244 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.840263 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.840275 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:47Z","lastTransitionTime":"2025-10-07T17:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.944387 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.944740 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.944756 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.944772 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:03:47 crc kubenswrapper[4681]: I1007 17:03:47.944781 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:47Z","lastTransitionTime":"2025-10-07T17:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.047911 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.047970 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.047987 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.048015 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.048033 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:48Z","lastTransitionTime":"2025-10-07T17:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.150954 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.151105 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.151125 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.151158 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.151178 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:48Z","lastTransitionTime":"2025-10-07T17:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.253334 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.253376 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.253387 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.253405 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.253418 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:48Z","lastTransitionTime":"2025-10-07T17:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.255562 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6lkl_615b8d72-0ec5-42d0-966e-db1c2b787962/ovnkube-controller/0.log" Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.258864 4681 generic.go:334] "Generic (PLEG): container finished" podID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerID="c7932807ceacea311845174e6b64a4347ff0a80b857d30bf495027926190c993" exitCode=1 Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.258926 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerDied","Data":"c7932807ceacea311845174e6b64a4347ff0a80b857d30bf495027926190c993"} Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.260375 4681 scope.go:117] "RemoveContainer" containerID="c7932807ceacea311845174e6b64a4347ff0a80b857d30bf495027926190c993" Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.298300 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"s
tate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c07906391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a36
89d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:48Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.319472 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:48Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.344672 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7932807ceacea311845174e6b64a4347ff0a80b
857d30bf495027926190c993\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7932807ceacea311845174e6b64a4347ff0a80b857d30bf495027926190c993\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"message\\\":\\\"om k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:47.715022 5882 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 17:03:47.715293 5882 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:47.715395 5882 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 17:03:47.715448 5882 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:47.715678 5882 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1007 17:03:47.716725 5882 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 17:03:47.716771 5882 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 17:03:47.716802 5882 factory.go:656] Stopping watch factory\\\\nI1007 17:03:47.716819 5882 ovnkube.go:599] Stopped ovnkube\\\\nI1007 17:03:47.716853 5882 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 
17:03:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0
d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:48Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.355373 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.355398 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.355406 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.355419 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.355428 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:48Z","lastTransitionTime":"2025-10-07T17:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.360509 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:48Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.374132 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:48Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.390418 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:48Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.401440 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:48Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.416026 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba
8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:48Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.426498 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:48Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.440492 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:48Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.452301 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:48Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.457716 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.457746 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.457755 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.457769 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.457778 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:48Z","lastTransitionTime":"2025-10-07T17:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.461633 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:48Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.472033 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:48Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.482313 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:48Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.495897 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f774e29adda222a4662ace0017b8b7697481452841491133bf4839360ba9a8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T17:03:48Z is after 2025-08-24T17:21:41Z"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.560078 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.560129 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.560141 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.560161 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.560172 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:48Z","lastTransitionTime":"2025-10-07T17:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.662988 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.663030 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.663043 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.663060 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.663072 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:48Z","lastTransitionTime":"2025-10-07T17:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.766068 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.766119 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.766133 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.766155 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.766203 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:48Z","lastTransitionTime":"2025-10-07T17:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.868691 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.868746 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.868760 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.868781 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.868798 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:48Z","lastTransitionTime":"2025-10-07T17:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.972357 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.972439 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.972466 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.972590 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:03:48 crc kubenswrapper[4681]: I1007 17:03:48.972619 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:48Z","lastTransitionTime":"2025-10-07T17:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.028472 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.028502 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.028472 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 17:03:49 crc kubenswrapper[4681]: E1007 17:03:49.028699 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 17:03:49 crc kubenswrapper[4681]: E1007 17:03:49.028894 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 17:03:49 crc kubenswrapper[4681]: E1007 17:03:49.029055 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.074848 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.074913 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.074926 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.074946 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.074961 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:49Z","lastTransitionTime":"2025-10-07T17:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.177932 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.177968 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.177976 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.177993 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.178004 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:49Z","lastTransitionTime":"2025-10-07T17:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.266557 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6lkl_615b8d72-0ec5-42d0-966e-db1c2b787962/ovnkube-controller/0.log"
Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.269182 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerStarted","Data":"625e34e8f283301933c61a38ab86b870853697c4563bb53ad877cf1cea387ba0"}
Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.269993 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.280723 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.280762 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.280771 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.280789 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.280803 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:49Z","lastTransitionTime":"2025-10-07T17:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.289034 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:49Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.306227 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:49Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.335697 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:49Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.353818 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba76701840
89f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:49Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.376558 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:49Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.383681 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.383754 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.383772 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.383804 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.383823 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:49Z","lastTransitionTime":"2025-10-07T17:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.395930 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:49Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.413189 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:49Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.430737 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:49Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.449385 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:49Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.469564 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f774e29adda222a4662ace0017b8b7697481452841491133bf4839360ba9a8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb3841886
0feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:49Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.486503 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.486553 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.486563 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.486589 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.486601 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:49Z","lastTransitionTime":"2025-10-07T17:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.487537 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:49Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.504226 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:49Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.534151 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c0790
6391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:49Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.558834 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:49Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.583910 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625e34e8f283301933c61a38ab86b870853697c4
563bb53ad877cf1cea387ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7932807ceacea311845174e6b64a4347ff0a80b857d30bf495027926190c993\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"message\\\":\\\"om k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:47.715022 5882 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 17:03:47.715293 5882 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:47.715395 5882 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 17:03:47.715448 5882 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:47.715678 5882 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1007 17:03:47.716725 5882 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 17:03:47.716771 5882 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 17:03:47.716802 5882 factory.go:656] Stopping watch factory\\\\nI1007 17:03:47.716819 5882 ovnkube.go:599] Stopped ovnkube\\\\nI1007 17:03:47.716853 5882 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 
17:03:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\
\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:49Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.589032 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.589070 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.589083 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.589105 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.589120 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:49Z","lastTransitionTime":"2025-10-07T17:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.693102 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.693149 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.693157 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.693175 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.693191 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:49Z","lastTransitionTime":"2025-10-07T17:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.796233 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.796323 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.796341 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.796422 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.796449 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:49Z","lastTransitionTime":"2025-10-07T17:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.900347 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.900408 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.900428 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.900462 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:49 crc kubenswrapper[4681]: I1007 17:03:49.900484 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:49Z","lastTransitionTime":"2025-10-07T17:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.003735 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.003910 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.003948 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.003988 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.004012 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:50Z","lastTransitionTime":"2025-10-07T17:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.107588 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.107674 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.107698 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.107736 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.107763 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:50Z","lastTransitionTime":"2025-10-07T17:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.211540 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.211583 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.211597 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.211617 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.211632 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:50Z","lastTransitionTime":"2025-10-07T17:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.265850 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862"] Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.266693 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.268667 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.269910 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.275858 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6lkl_615b8d72-0ec5-42d0-966e-db1c2b787962/ovnkube-controller/1.log" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.276839 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6lkl_615b8d72-0ec5-42d0-966e-db1c2b787962/ovnkube-controller/0.log" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.281505 4681 generic.go:334] "Generic (PLEG): container finished" podID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerID="625e34e8f283301933c61a38ab86b870853697c4563bb53ad877cf1cea387ba0" exitCode=1 Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.281563 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerDied","Data":"625e34e8f283301933c61a38ab86b870853697c4563bb53ad877cf1cea387ba0"} Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.281609 4681 scope.go:117] "RemoveContainer" containerID="c7932807ceacea311845174e6b64a4347ff0a80b857d30bf495027926190c993" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.283475 4681 scope.go:117] "RemoveContainer" containerID="625e34e8f283301933c61a38ab86b870853697c4563bb53ad877cf1cea387ba0" Oct 07 17:03:50 crc kubenswrapper[4681]: E1007 17:03:50.284035 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-d6lkl_openshift-ovn-kubernetes(615b8d72-0ec5-42d0-966e-db1c2b787962)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.296106 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.311306 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.314262 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.314293 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.314302 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.314318 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.314328 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:50Z","lastTransitionTime":"2025-10-07T17:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.325575 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.337754 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.353505 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f774e29adda222a4662ace0017b8b7697481452841491133bf4839360ba9a8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.362155 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.362664 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8a2c488b-e563-4bc2-aaec-064d33709f54-env-overrides\") pod \"ovnkube-control-plane-749d76644c-52862\" (UID: \"8a2c488b-e563-4bc2-aaec-064d33709f54\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.362817 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/8a2c488b-e563-4bc2-aaec-064d33709f54-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-52862\" (UID: \"8a2c488b-e563-4bc2-aaec-064d33709f54\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.362981 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8a2c488b-e563-4bc2-aaec-064d33709f54-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-52862\" (UID: \"8a2c488b-e563-4bc2-aaec-064d33709f54\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.363042 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns8th\" (UniqueName: \"kubernetes.io/projected/8a2c488b-e563-4bc2-aaec-064d33709f54-kube-api-access-ns8th\") pod \"ovnkube-control-plane-749d76644c-52862\" (UID: \"8a2c488b-e563-4bc2-aaec-064d33709f54\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.370542 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod 
\"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.380523 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.391412 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.408394 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625e34e8f283301933c61a38ab86b870853697c4
563bb53ad877cf1cea387ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7932807ceacea311845174e6b64a4347ff0a80b857d30bf495027926190c993\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"message\\\":\\\"om k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:47.715022 5882 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 17:03:47.715293 5882 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:47.715395 5882 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 17:03:47.715448 5882 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:47.715678 5882 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1007 17:03:47.716725 5882 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 17:03:47.716771 5882 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 17:03:47.716802 5882 factory.go:656] Stopping watch factory\\\\nI1007 17:03:47.716819 5882 ovnkube.go:599] Stopped ovnkube\\\\nI1007 17:03:47.716853 5882 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 
17:03:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\
\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.415867 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.415923 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.415936 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.415953 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.415965 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:50Z","lastTransitionTime":"2025-10-07T17:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.435246 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c07906391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.445991 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.456765 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.464251 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8a2c488b-e563-4bc2-aaec-064d33709f54-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-52862\" (UID: \"8a2c488b-e563-4bc2-aaec-064d33709f54\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.464320 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8a2c488b-e563-4bc2-aaec-064d33709f54-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-52862\" (UID: \"8a2c488b-e563-4bc2-aaec-064d33709f54\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.464351 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-ns8th\" (UniqueName: \"kubernetes.io/projected/8a2c488b-e563-4bc2-aaec-064d33709f54-kube-api-access-ns8th\") pod \"ovnkube-control-plane-749d76644c-52862\" (UID: \"8a2c488b-e563-4bc2-aaec-064d33709f54\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.464367 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8a2c488b-e563-4bc2-aaec-064d33709f54-env-overrides\") pod \"ovnkube-control-plane-749d76644c-52862\" (UID: \"8a2c488b-e563-4bc2-aaec-064d33709f54\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.464956 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8a2c488b-e563-4bc2-aaec-064d33709f54-env-overrides\") pod \"ovnkube-control-plane-749d76644c-52862\" (UID: \"8a2c488b-e563-4bc2-aaec-064d33709f54\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.465193 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8a2c488b-e563-4bc2-aaec-064d33709f54-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-52862\" (UID: \"8a2c488b-e563-4bc2-aaec-064d33709f54\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.468394 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.471561 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8a2c488b-e563-4bc2-aaec-064d33709f54-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-52862\" (UID: \"8a2c488b-e563-4bc2-aaec-064d33709f54\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.482827 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns8th\" (UniqueName: \"kubernetes.io/projected/8a2c488b-e563-4bc2-aaec-064d33709f54-kube-api-access-ns8th\") pod \"ovnkube-control-plane-749d76644c-52862\" (UID: \"8a2c488b-e563-4bc2-aaec-064d33709f54\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.483151 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a
8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.495539 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a2c488b-e563-4bc2-aaec-064d33709f54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-52862\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.507033 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a2c488b-e563-4bc2-aaec-064d33709f54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-52862\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.519767 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.519844 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.519858 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.519906 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.519921 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:50Z","lastTransitionTime":"2025-10-07T17:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.521917 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.537173 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.547633 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.560564 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba76701840
89f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.573054 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.585710 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.586287 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: W1007 17:03:50.597388 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a2c488b_e563_4bc2_aaec_064d33709f54.slice/crio-5ed312d539251507f0f9f71e58fe00a60507c150ae2eb2a8854a869fc14f1ce3 WatchSource:0}: Error finding container 5ed312d539251507f0f9f71e58fe00a60507c150ae2eb2a8854a869fc14f1ce3: Status 404 returned error can't find the container with id 5ed312d539251507f0f9f71e58fe00a60507c150ae2eb2a8854a869fc14f1ce3 Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.605600 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.617483 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.622150 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.622184 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.622195 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.622212 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.622226 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:50Z","lastTransitionTime":"2025-10-07T17:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.632346 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.649094 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f774e29adda222a4662ace0017b8b7697481452841491133bf4839360ba9a8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cn
ibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.664241 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.675266 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.697153 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c0790
6391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.712905 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.725061 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.725100 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.725112 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.725130 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.725142 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:50Z","lastTransitionTime":"2025-10-07T17:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.732119 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625e34e8f283301933c61a38ab86b870853697c4563bb53ad877cf1cea387ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7932807ceacea311845174e6b64a4347ff0a80b857d30bf495027926190c993\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"message\\\":\\\"om k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:47.715022 5882 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 17:03:47.715293 5882 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:47.715395 5882 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 17:03:47.715448 5882 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:47.715678 5882 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1007 17:03:47.716725 5882 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 17:03:47.716771 5882 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 17:03:47.716802 5882 factory.go:656] Stopping watch factory\\\\nI1007 17:03:47.716819 5882 ovnkube.go:599] Stopped ovnkube\\\\nI1007 17:03:47.716853 5882 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 
17:03:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://625e34e8f283301933c61a38ab86b870853697c4563bb53ad877cf1cea387ba0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:03:49Z\\\",\\\"message\\\":\\\"licy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 17:03:49.125929 6046 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.126286 6046 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 17:03:49.126439 6046 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.126680 6046 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 17:03:49.126995 6046 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 17:03:49.127071 6046 factory.go:656] Stopping watch factory\\\\nI1007 17:03:49.127071 6046 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.127090 6046 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 17:03:49.127215 6046 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"moun
tPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:50Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.827060 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.827530 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.827544 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.827562 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.827575 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:50Z","lastTransitionTime":"2025-10-07T17:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.929916 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.930163 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.930235 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.930310 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:50 crc kubenswrapper[4681]: I1007 17:03:50.930460 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:50Z","lastTransitionTime":"2025-10-07T17:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.028370 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:03:51 crc kubenswrapper[4681]: E1007 17:03:51.028656 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.028465 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.028370 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:03:51 crc kubenswrapper[4681]: E1007 17:03:51.028994 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:03:51 crc kubenswrapper[4681]: E1007 17:03:51.029130 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.032900 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.033044 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.033126 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.033221 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.033285 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:51Z","lastTransitionTime":"2025-10-07T17:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.135817 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.136030 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.136089 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.136163 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.136219 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:51Z","lastTransitionTime":"2025-10-07T17:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.238385 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.238421 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.238433 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.238447 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.238458 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:51Z","lastTransitionTime":"2025-10-07T17:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.286699 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" event={"ID":"8a2c488b-e563-4bc2-aaec-064d33709f54","Type":"ContainerStarted","Data":"a9f7b57b64232d8263f8b4c64781ddba6d33986e7d379a64caa7b5abc9dae9e9"} Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.286749 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" event={"ID":"8a2c488b-e563-4bc2-aaec-064d33709f54","Type":"ContainerStarted","Data":"93918b739fba58a9b4e23d3131ae90e34548a07be5a59ceb18accaa6516572e5"} Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.286763 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" event={"ID":"8a2c488b-e563-4bc2-aaec-064d33709f54","Type":"ContainerStarted","Data":"5ed312d539251507f0f9f71e58fe00a60507c150ae2eb2a8854a869fc14f1ce3"} Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.289301 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6lkl_615b8d72-0ec5-42d0-966e-db1c2b787962/ovnkube-controller/1.log" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.293777 4681 scope.go:117] "RemoveContainer" containerID="625e34e8f283301933c61a38ab86b870853697c4563bb53ad877cf1cea387ba0" Oct 07 17:03:51 crc kubenswrapper[4681]: E1007 17:03:51.293947 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-d6lkl_openshift-ovn-kubernetes(615b8d72-0ec5-42d0-966e-db1c2b787962)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.309686 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.323837 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a2c488b-e563-4bc2-aaec-064d33709f54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93918b739fba58a9b4e23d3131ae90e34548a07be5a59ceb18accaa6516572e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f7b57b64232d8263f8b4c64781ddba6d33986e7d379a64caa7b5abc9dae9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-52862\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.340616 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.340675 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.340693 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.340717 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.340736 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:51Z","lastTransitionTime":"2025-10-07T17:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.345166 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.361570 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.373036 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.384560 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-xjf9z"] Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.384976 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:03:51 crc kubenswrapper[4681]: E1007 17:03:51.385032 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.390109 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.405288 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.419072 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.431923 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.443502 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.443535 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.443543 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.443557 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.443565 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:51Z","lastTransitionTime":"2025-10-07T17:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.447530 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.461517 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.473772 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/35b1b84e-518a-4567-8ad9-0e717e9958fb-metrics-certs\") pod \"network-metrics-daemon-xjf9z\" (UID: \"35b1b84e-518a-4567-8ad9-0e717e9958fb\") " pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.473899 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f774e29adda222a4662ace0017b8b7697481452841491133bf4839360ba9a8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"start
edAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.1
68.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.474413 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvp2l\" (UniqueName: \"kubernetes.io/projected/35b1b84e-518a-4567-8ad9-0e717e9958fb-kube-api-access-jvp2l\") pod \"network-metrics-daemon-xjf9z\" (UID: \"35b1b84e-518a-4567-8ad9-0e717e9958fb\") " pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.482425 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.498459 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c07906391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\"
:0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.511286 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.539216 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625e34e8f283301933c61a38ab86b870853697c4
563bb53ad877cf1cea387ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7932807ceacea311845174e6b64a4347ff0a80b857d30bf495027926190c993\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"message\\\":\\\"om k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:47.715022 5882 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 17:03:47.715293 5882 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:47.715395 5882 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 17:03:47.715448 5882 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:47.715678 5882 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1007 17:03:47.716725 5882 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 17:03:47.716771 5882 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 17:03:47.716802 5882 factory.go:656] Stopping watch factory\\\\nI1007 17:03:47.716819 5882 ovnkube.go:599] Stopped ovnkube\\\\nI1007 17:03:47.716853 5882 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 17:03:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://625e34e8f283301933c61a38ab86b870853697c4563bb53ad877cf1cea387ba0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:03:49Z\\\",\\\"message\\\":\\\"licy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 17:03:49.125929 6046 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.126286 6046 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 17:03:49.126439 6046 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.126680 6046 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 17:03:49.126995 6046 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 17:03:49.127071 6046 factory.go:656] Stopping watch factory\\\\nI1007 17:03:49.127071 6046 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.127090 6046 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 17:03:49.127215 6046 reflector.go:311] Stopping reflector 
*v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715
264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.546070 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.546131 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.546146 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.546493 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.546512 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:51Z","lastTransitionTime":"2025-10-07T17:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.554583 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.568329 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: E1007 17:03:51.575632 4681 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 17:03:51 crc kubenswrapper[4681]: E1007 17:03:51.575725 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35b1b84e-518a-4567-8ad9-0e717e9958fb-metrics-certs podName:35b1b84e-518a-4567-8ad9-0e717e9958fb nodeName:}" failed. No retries permitted until 2025-10-07 17:03:52.075702135 +0000 UTC m=+35.723113700 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/35b1b84e-518a-4567-8ad9-0e717e9958fb-metrics-certs") pod "network-metrics-daemon-xjf9z" (UID: "35b1b84e-518a-4567-8ad9-0e717e9958fb") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.575469 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/35b1b84e-518a-4567-8ad9-0e717e9958fb-metrics-certs\") pod \"network-metrics-daemon-xjf9z\" (UID: \"35b1b84e-518a-4567-8ad9-0e717e9958fb\") " pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.576117 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvp2l\" (UniqueName: \"kubernetes.io/projected/35b1b84e-518a-4567-8ad9-0e717e9958fb-kube-api-access-jvp2l\") pod \"network-metrics-daemon-xjf9z\" (UID: \"35b1b84e-518a-4567-8ad9-0e717e9958fb\") " pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.580786 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.593253 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvp2l\" (UniqueName: \"kubernetes.io/projected/35b1b84e-518a-4567-8ad9-0e717e9958fb-kube-api-access-jvp2l\") pod \"network-metrics-daemon-xjf9z\" (UID: \"35b1b84e-518a-4567-8ad9-0e717e9958fb\") " pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.595722 4681 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.607778 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xjf9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35b1b84e-518a-4567-8ad9-0e717e9958fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xjf9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.619499 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.640480 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.649683 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.649717 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.649726 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.649743 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.649755 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:51Z","lastTransitionTime":"2025-10-07T17:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.656411 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f774e29adda222a4662ace0017b8b7697481452841491133bf4839360ba9a8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.666104 4681 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.674416 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.690661 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c0790
6391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.703731 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.719850 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625e34e8f283301933c61a38ab86b870853697c4
563bb53ad877cf1cea387ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://625e34e8f283301933c61a38ab86b870853697c4563bb53ad877cf1cea387ba0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:03:49Z\\\",\\\"message\\\":\\\"licy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 17:03:49.125929 6046 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.126286 6046 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 17:03:49.126439 6046 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.126680 6046 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 17:03:49.126995 6046 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 17:03:49.127071 6046 factory.go:656] Stopping watch factory\\\\nI1007 17:03:49.127071 6046 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.127090 6046 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 17:03:49.127215 6046 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d6lkl_openshift-ovn-kubernetes(615b8d72-0ec5-42d0-966e-db1c2b787962)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.731939 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.743242 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.752829 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.752863 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.752893 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.752912 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.752924 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:51Z","lastTransitionTime":"2025-10-07T17:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.757022 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.769166 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a2c488b-e563-4bc2-aaec-064d33709f54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93918b739fba58a9b4e23d3131ae90e34548a07be5a59ceb18accaa6516572e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f7b57b64232d8263f8b4c64781ddba6d33986e7d379a64caa7b5abc9dae9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:
50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-52862\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:51Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.855207 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.855275 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.855296 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.855322 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.855340 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:51Z","lastTransitionTime":"2025-10-07T17:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.958416 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.958503 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.958524 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.958586 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:51 crc kubenswrapper[4681]: I1007 17:03:51.958619 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:51Z","lastTransitionTime":"2025-10-07T17:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.061612 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.061694 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.061716 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.061753 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.061776 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:52Z","lastTransitionTime":"2025-10-07T17:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.080111 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/35b1b84e-518a-4567-8ad9-0e717e9958fb-metrics-certs\") pod \"network-metrics-daemon-xjf9z\" (UID: \"35b1b84e-518a-4567-8ad9-0e717e9958fb\") " pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:03:52 crc kubenswrapper[4681]: E1007 17:03:52.080255 4681 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 17:03:52 crc kubenswrapper[4681]: E1007 17:03:52.080326 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35b1b84e-518a-4567-8ad9-0e717e9958fb-metrics-certs podName:35b1b84e-518a-4567-8ad9-0e717e9958fb nodeName:}" failed. No retries permitted until 2025-10-07 17:03:53.080310003 +0000 UTC m=+36.727721548 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/35b1b84e-518a-4567-8ad9-0e717e9958fb-metrics-certs") pod "network-metrics-daemon-xjf9z" (UID: "35b1b84e-518a-4567-8ad9-0e717e9958fb") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.165487 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.165537 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.165552 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.165568 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.165579 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:52Z","lastTransitionTime":"2025-10-07T17:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.269620 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.269689 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.269708 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.269736 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.269768 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:52Z","lastTransitionTime":"2025-10-07T17:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.373435 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.373474 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.373485 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.373502 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.373514 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:52Z","lastTransitionTime":"2025-10-07T17:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.475645 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.475958 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.476056 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.476264 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.476385 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:52Z","lastTransitionTime":"2025-10-07T17:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.579030 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.579339 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.579518 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.579647 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.579762 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:52Z","lastTransitionTime":"2025-10-07T17:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.683368 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.683448 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.683473 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.683500 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.683521 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:52Z","lastTransitionTime":"2025-10-07T17:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.786404 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.786749 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.786958 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.787140 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.787302 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:52Z","lastTransitionTime":"2025-10-07T17:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.787322 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:03:52 crc kubenswrapper[4681]: E1007 17:03:52.787440 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:04:08.78740956 +0000 UTC m=+52.434821155 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.787787 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.787858 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.787953 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.788001 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:03:52 crc kubenswrapper[4681]: E1007 17:03:52.788117 4681 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 17:03:52 crc kubenswrapper[4681]: E1007 17:03:52.788140 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 17:03:52 crc kubenswrapper[4681]: E1007 17:03:52.788146 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 17:03:52 crc kubenswrapper[4681]: E1007 17:03:52.788167 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 17:03:52 crc kubenswrapper[4681]: E1007 17:03:52.788203 4681 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:03:52 crc 
kubenswrapper[4681]: E1007 17:03:52.788206 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 17:03:52 crc kubenswrapper[4681]: E1007 17:03:52.788220 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 17:04:08.788193111 +0000 UTC m=+52.435604696 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 17:03:52 crc kubenswrapper[4681]: E1007 17:03:52.788237 4681 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:03:52 crc kubenswrapper[4681]: E1007 17:03:52.788254 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 17:04:08.788237642 +0000 UTC m=+52.435649237 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:03:52 crc kubenswrapper[4681]: E1007 17:03:52.788116 4681 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 17:03:52 crc kubenswrapper[4681]: E1007 17:03:52.788350 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 17:04:08.788308604 +0000 UTC m=+52.435720199 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:03:52 crc kubenswrapper[4681]: E1007 17:03:52.788391 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 17:04:08.788372216 +0000 UTC m=+52.435783901 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.891290 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.891347 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.891364 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.891388 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.891405 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:52Z","lastTransitionTime":"2025-10-07T17:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.993931 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.993972 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.993982 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.993999 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:52 crc kubenswrapper[4681]: I1007 17:03:52.994011 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:52Z","lastTransitionTime":"2025-10-07T17:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.025130 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.025166 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.025174 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.025190 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.025198 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:53Z","lastTransitionTime":"2025-10-07T17:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.028236 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.028250 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.028383 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:03:53 crc kubenswrapper[4681]: E1007 17:03:53.028544 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.028606 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:03:53 crc kubenswrapper[4681]: E1007 17:03:53.028689 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:03:53 crc kubenswrapper[4681]: E1007 17:03:53.028853 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:03:53 crc kubenswrapper[4681]: E1007 17:03:53.029063 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:03:53 crc kubenswrapper[4681]: E1007 17:03:53.042682 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:53Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.047344 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.047382 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.047393 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.047407 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.047418 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:53Z","lastTransitionTime":"2025-10-07T17:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:53 crc kubenswrapper[4681]: E1007 17:03:53.065405 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:53Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.070231 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.070291 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
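[annotation] Every node-status patch in this capture dies on the same expired serving certificate for the node.network-node-identity.openshift.io webhook: valid until 2025-08-24T17:21:41Z, checked at 2025-10-07T17:03:53Z. A minimal Go sketch, not part of the captured log, for confirming the expiry from the node itself; the address and port are taken from the Post URL in the entry above, and InsecureSkipVerify is deliberate so the expired chain can still be read:

```go
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// Connect to the webhook endpoint named in the kubelet error; skip
	// verification so an expired certificate does not abort the handshake.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial webhook endpoint: %v", err)
	}
	defer conn.Close()

	// Print the validity window of each certificate the server presented.
	for _, cert := range conn.ConnectionState().PeerCertificates {
		fmt.Printf("subject=%q notBefore=%s notAfter=%s expired=%t\n",
			cert.Subject.CommonName,
			cert.NotBefore.Format(time.RFC3339),
			cert.NotAfter.Format(time.RFC3339),
			time.Now().After(cert.NotAfter))
	}
}
```

The sketch only confirms what the kubelet already reports; re-issuing the webhook's certificates is a cluster-operator action the log itself does not show.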
event="NodeHasNoDiskPressure" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.070310 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.070341 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.070361 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:53Z","lastTransitionTime":"2025-10-07T17:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:53 crc kubenswrapper[4681]: E1007 17:03:53.087993 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:53Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.091179 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/35b1b84e-518a-4567-8ad9-0e717e9958fb-metrics-certs\") pod \"network-metrics-daemon-xjf9z\" (UID: \"35b1b84e-518a-4567-8ad9-0e717e9958fb\") " 
pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:03:53 crc kubenswrapper[4681]: E1007 17:03:53.091379 4681 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 17:03:53 crc kubenswrapper[4681]: E1007 17:03:53.091500 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35b1b84e-518a-4567-8ad9-0e717e9958fb-metrics-certs podName:35b1b84e-518a-4567-8ad9-0e717e9958fb nodeName:}" failed. No retries permitted until 2025-10-07 17:03:55.091474201 +0000 UTC m=+38.738885756 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/35b1b84e-518a-4567-8ad9-0e717e9958fb-metrics-certs") pod "network-metrics-daemon-xjf9z" (UID: "35b1b84e-518a-4567-8ad9-0e717e9958fb") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.092122 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.092157 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.092169 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.092187 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.092209 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:53Z","lastTransitionTime":"2025-10-07T17:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:53 crc kubenswrapper[4681]: E1007 17:03:53.110072 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:53Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.114982 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.115020 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.115029 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.115048 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.115060 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:53Z","lastTransitionTime":"2025-10-07T17:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:53 crc kubenswrapper[4681]: E1007 17:03:53.132033 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:53Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:53 crc kubenswrapper[4681]: E1007 17:03:53.132262 4681 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.134186 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.134215 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.134223 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.134240 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.134256 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:53Z","lastTransitionTime":"2025-10-07T17:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.237002 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.237041 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.237053 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.237072 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.237085 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:53Z","lastTransitionTime":"2025-10-07T17:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.339351 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.339387 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.339397 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.339411 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.339422 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:53Z","lastTransitionTime":"2025-10-07T17:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.442639 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.442704 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.442726 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.442756 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.442780 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:53Z","lastTransitionTime":"2025-10-07T17:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.546161 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.546230 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.546252 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.546279 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.546301 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:53Z","lastTransitionTime":"2025-10-07T17:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.648679 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.648713 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.648721 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.648734 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.648743 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:53Z","lastTransitionTime":"2025-10-07T17:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.751302 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.751368 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.751386 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.751409 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.751427 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:53Z","lastTransitionTime":"2025-10-07T17:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.854450 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.854499 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.854515 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.854537 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.854553 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:53Z","lastTransitionTime":"2025-10-07T17:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.957309 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.957362 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.957378 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.957401 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:53 crc kubenswrapper[4681]: I1007 17:03:53.957417 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:53Z","lastTransitionTime":"2025-10-07T17:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.059739 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.059793 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.059805 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.059825 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.059839 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:54Z","lastTransitionTime":"2025-10-07T17:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.162326 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.162363 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.162375 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.162391 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.162402 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:54Z","lastTransitionTime":"2025-10-07T17:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.265190 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.265236 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.265248 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.265266 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.265277 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:54Z","lastTransitionTime":"2025-10-07T17:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.368417 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.368517 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.368539 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.368580 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.368605 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:54Z","lastTransitionTime":"2025-10-07T17:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.472628 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.472688 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.472707 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.472734 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.472759 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:54Z","lastTransitionTime":"2025-10-07T17:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.576018 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.576099 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.576123 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.576163 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.576195 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:54Z","lastTransitionTime":"2025-10-07T17:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.679313 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.679375 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.679396 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.679423 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.679442 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:54Z","lastTransitionTime":"2025-10-07T17:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.783492 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.783546 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.783556 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.783574 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.783586 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:54Z","lastTransitionTime":"2025-10-07T17:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.886567 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.886621 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.886634 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.886656 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.886672 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:54Z","lastTransitionTime":"2025-10-07T17:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.990853 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.990994 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.991019 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.991061 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:54 crc kubenswrapper[4681]: I1007 17:03:54.991085 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:54Z","lastTransitionTime":"2025-10-07T17:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.028408 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.028552 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.028843 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:03:55 crc kubenswrapper[4681]: E1007 17:03:55.028826 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:03:55 crc kubenswrapper[4681]: E1007 17:03:55.029003 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.029074 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:03:55 crc kubenswrapper[4681]: E1007 17:03:55.029134 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
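
[Note] With the CNI configuration still missing, the kubelet above starts a sandbox-creation cycle for the four network pods (network-metrics-daemon-xjf9z, network-check-source-55646444c4-trplf, network-check-target-xd92c, networking-console-plugin-85b44fc459-gdk6g) and immediately skips each sync with "network is not ready"; the cycle will repeat until the network plugin writes its configuration. Illustrative checks from the node:

  # See whether the network plugin has written any CNI config yet
  ls -l /etc/kubernetes/cni/net.d/

  # List the pod sandboxes the container runtime currently knows about
  crictl pods
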
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:03:55 crc kubenswrapper[4681]: E1007 17:03:55.029333 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.095401 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.095936 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.096020 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.096110 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.096231 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:55Z","lastTransitionTime":"2025-10-07T17:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.115495 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/35b1b84e-518a-4567-8ad9-0e717e9958fb-metrics-certs\") pod \"network-metrics-daemon-xjf9z\" (UID: \"35b1b84e-518a-4567-8ad9-0e717e9958fb\") " pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:03:55 crc kubenswrapper[4681]: E1007 17:03:55.115826 4681 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 17:03:55 crc kubenswrapper[4681]: E1007 17:03:55.115993 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35b1b84e-518a-4567-8ad9-0e717e9958fb-metrics-certs podName:35b1b84e-518a-4567-8ad9-0e717e9958fb nodeName:}" failed. No retries permitted until 2025-10-07 17:03:59.115962783 +0000 UTC m=+42.763374338 (durationBeforeRetry 4s). 
Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.200098 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.200691 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.200836 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.200999 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.201128 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:55Z","lastTransitionTime":"2025-10-07T17:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.304793 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.304835 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.304844 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.305056 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.305066 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:55Z","lastTransitionTime":"2025-10-07T17:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.408339 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.408805 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.408932 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.409111 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.409276 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:55Z","lastTransitionTime":"2025-10-07T17:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.512398 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.512448 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.512463 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.512482 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.512494 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:55Z","lastTransitionTime":"2025-10-07T17:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.615278 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.615351 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.615399 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.615418 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.615432 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:55Z","lastTransitionTime":"2025-10-07T17:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.718981 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.719027 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.719037 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.719055 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.719066 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:55Z","lastTransitionTime":"2025-10-07T17:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.821356 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.821414 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.821438 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.821536 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.821564 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:55Z","lastTransitionTime":"2025-10-07T17:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.925097 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.925164 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.925181 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.925212 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:55 crc kubenswrapper[4681]: I1007 17:03:55.925232 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:55Z","lastTransitionTime":"2025-10-07T17:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.029625 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.029676 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.029688 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.029709 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.029721 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:56Z","lastTransitionTime":"2025-10-07T17:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.132939 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.132974 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.132985 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.133001 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.133013 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:56Z","lastTransitionTime":"2025-10-07T17:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.236121 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.236169 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.236179 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.236196 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.236207 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:56Z","lastTransitionTime":"2025-10-07T17:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.340099 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.340229 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.340252 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.340283 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.340303 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:56Z","lastTransitionTime":"2025-10-07T17:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.443566 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.443622 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.443636 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.443657 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.443670 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:56Z","lastTransitionTime":"2025-10-07T17:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.546236 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.546290 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.546302 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.546323 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.546336 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:56Z","lastTransitionTime":"2025-10-07T17:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.648535 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.648678 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.648699 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.648732 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.648753 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:56Z","lastTransitionTime":"2025-10-07T17:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.751118 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.751155 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.751165 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.751178 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.751187 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:56Z","lastTransitionTime":"2025-10-07T17:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.853651 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.853707 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.853724 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.853747 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.853766 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:56Z","lastTransitionTime":"2025-10-07T17:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.956528 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.956613 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.956737 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.956775 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:56 crc kubenswrapper[4681]: I1007 17:03:56.956838 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:56Z","lastTransitionTime":"2025-10-07T17:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.028599 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:03:57 crc kubenswrapper[4681]: E1007 17:03:57.028728 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.028793 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:03:57 crc kubenswrapper[4681]: E1007 17:03:57.028872 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.028965 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:03:57 crc kubenswrapper[4681]: E1007 17:03:57.029024 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.029057 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
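
[Note] Two seconds after the 17:03:55 attempt, the same four pods go through an identical no-sandbox/sync-skip cycle, and it will keep recurring on this cadence for as long as /etc/kubernetes/cni/net.d/ stays empty. An illustrative way to watch the loop live from the node:

  # Follow the recurring sandbox retries in the kubelet journal
  journalctl -b -u kubelet.service -f | grep "No sandbox for pod"
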
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:03:57 crc kubenswrapper[4681]: E1007 17:03:57.029147 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.029619 4681 scope.go:117] "RemoveContainer" containerID="210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.049465 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a2c488b-e563-4bc2-aaec-064d33709f54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93918b739fba58a9b4e23d3131ae90e34548a07be5a59ceb18accaa6516572e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f7b57b64232d8263f8b4c64781ddba6d33986e7d379a64caa7b5abc9dae9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-52862\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.060209 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.060251 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.060263 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.060280 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.060290 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:57Z","lastTransitionTime":"2025-10-07T17:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.063378 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.074402 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.086665 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.100596 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba76701840
89f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.113026 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.123781 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 
2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.134821 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.145134 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xjf9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35b1b84e-518a-4567-8ad9-0e717e9958fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xjf9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.156483 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.163014 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.163045 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.163057 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.163076 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.163088 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:57Z","lastTransitionTime":"2025-10-07T17:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.167721 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.179795 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f774e29adda222a4662ace0017b8b7697481452841491133bf4839360ba9a8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cn
ibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.189482 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.199724 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.220492 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c0790
6391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.231881 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.277443 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625e34e8f283301933c61a38ab86b870853697c4
563bb53ad877cf1cea387ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://625e34e8f283301933c61a38ab86b870853697c4563bb53ad877cf1cea387ba0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:03:49Z\\\",\\\"message\\\":\\\"licy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 17:03:49.125929 6046 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.126286 6046 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 17:03:49.126439 6046 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.126680 6046 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 17:03:49.126995 6046 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 17:03:49.127071 6046 factory.go:656] Stopping watch factory\\\\nI1007 17:03:49.127071 6046 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.127090 6046 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 17:03:49.127215 6046 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d6lkl_openshift-ovn-kubernetes(615b8d72-0ec5-42d0-966e-db1c2b787962)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.280179 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.280209 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.280219 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.280233 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.280244 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:57Z","lastTransitionTime":"2025-10-07T17:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.322751 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.324136 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d8e19e2693808909e599802074123b31395c3cb8992f734a1d7191848532e953"} Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.324914 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.342984 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c07906391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.355033 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.373453 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://625e34e8f283301933c61a38ab86b870853697c4
563bb53ad877cf1cea387ba0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://625e34e8f283301933c61a38ab86b870853697c4563bb53ad877cf1cea387ba0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:03:49Z\\\",\\\"message\\\":\\\"licy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 17:03:49.125929 6046 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.126286 6046 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 17:03:49.126439 6046 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.126680 6046 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 17:03:49.126995 6046 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 17:03:49.127071 6046 factory.go:656] Stopping watch factory\\\\nI1007 17:03:49.127071 6046 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.127090 6046 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 17:03:49.127215 6046 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d6lkl_openshift-ovn-kubernetes(615b8d72-0ec5-42d0-966e-db1c2b787962)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.382387 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.382421 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.382432 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.382449 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.382462 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:57Z","lastTransitionTime":"2025-10-07T17:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.384536 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a2c488b-e563-4bc2-aaec-064d33709f54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93918b739fba58a9b4e23d3131ae90e34548a07be5a59ceb18accaa6516572e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f7b57b64232d8263f8b4c64781ddba6d33986e7d379a64caa7b5abc9dae9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-52862\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.397525 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8e19e2693808909e599802074123b31395c3cb8992f734a1d7191848532e953\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.408067 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.418124 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.429214 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba76701840
89f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.439687 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.450811 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 
2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.461028 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.469751 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xjf9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35b1b84e-518a-4567-8ad9-0e717e9958fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xjf9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.480169 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.484535 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.484562 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.484572 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.484584 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.484596 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:57Z","lastTransitionTime":"2025-10-07T17:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.491227 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.502927 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f774e29adda222a4662ace0017b8b7697481452841491133bf4839360ba9a8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cn
ibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.511508 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.519686 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:03:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.586608 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.586645 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.586655 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.586669 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.586681 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:57Z","lastTransitionTime":"2025-10-07T17:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.688970 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.689013 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.689044 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.689061 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.689072 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:57Z","lastTransitionTime":"2025-10-07T17:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.791346 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.791378 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.791385 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.791417 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.791426 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:57Z","lastTransitionTime":"2025-10-07T17:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.893437 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.893471 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.893507 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.893529 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.893541 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:57Z","lastTransitionTime":"2025-10-07T17:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.995663 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.995862 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.995979 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.996050 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:57 crc kubenswrapper[4681]: I1007 17:03:57.996126 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:57Z","lastTransitionTime":"2025-10-07T17:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.097761 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.098022 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.098102 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.098179 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.098245 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:58Z","lastTransitionTime":"2025-10-07T17:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.199704 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.199743 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.199754 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.199768 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.199777 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:58Z","lastTransitionTime":"2025-10-07T17:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.302205 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.302242 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.302251 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.302263 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.302272 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:58Z","lastTransitionTime":"2025-10-07T17:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.404060 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.404279 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.404375 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.404468 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.404544 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:58Z","lastTransitionTime":"2025-10-07T17:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.510847 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.510909 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.510924 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.510941 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.510953 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:58Z","lastTransitionTime":"2025-10-07T17:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.613513 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.613555 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.613565 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.613578 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.613587 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:58Z","lastTransitionTime":"2025-10-07T17:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.716614 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.716727 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.716753 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.716787 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.716810 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:58Z","lastTransitionTime":"2025-10-07T17:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.819438 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.819471 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.819482 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.819495 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.819504 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:58Z","lastTransitionTime":"2025-10-07T17:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.922242 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.922279 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.922288 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.922303 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:58 crc kubenswrapper[4681]: I1007 17:03:58.922312 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:58Z","lastTransitionTime":"2025-10-07T17:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.025547 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.025589 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.025601 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.025618 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.025629 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:59Z","lastTransitionTime":"2025-10-07T17:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.029221 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:03:59 crc kubenswrapper[4681]: E1007 17:03:59.029377 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.029673 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:03:59 crc kubenswrapper[4681]: E1007 17:03:59.029750 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.029910 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:03:59 crc kubenswrapper[4681]: E1007 17:03:59.030077 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.030215 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:03:59 crc kubenswrapper[4681]: E1007 17:03:59.030526 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.129651 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.129690 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.129699 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.129713 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.129721 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:59Z","lastTransitionTime":"2025-10-07T17:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.154520 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/35b1b84e-518a-4567-8ad9-0e717e9958fb-metrics-certs\") pod \"network-metrics-daemon-xjf9z\" (UID: \"35b1b84e-518a-4567-8ad9-0e717e9958fb\") " pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:03:59 crc kubenswrapper[4681]: E1007 17:03:59.154648 4681 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 17:03:59 crc kubenswrapper[4681]: E1007 17:03:59.154699 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35b1b84e-518a-4567-8ad9-0e717e9958fb-metrics-certs podName:35b1b84e-518a-4567-8ad9-0e717e9958fb nodeName:}" failed. No retries permitted until 2025-10-07 17:04:07.154685598 +0000 UTC m=+50.802097153 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/35b1b84e-518a-4567-8ad9-0e717e9958fb-metrics-certs") pod "network-metrics-daemon-xjf9z" (UID: "35b1b84e-518a-4567-8ad9-0e717e9958fb") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.233224 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.233267 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.233276 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.233290 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.233302 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:59Z","lastTransitionTime":"2025-10-07T17:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.336533 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.336593 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.336609 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.336632 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.336649 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:59Z","lastTransitionTime":"2025-10-07T17:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.439231 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.439295 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.439308 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.439348 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.439361 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:59Z","lastTransitionTime":"2025-10-07T17:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.542553 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.542615 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.542632 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.542656 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.542673 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:59Z","lastTransitionTime":"2025-10-07T17:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.645466 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.645649 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.645673 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.645696 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.645742 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:59Z","lastTransitionTime":"2025-10-07T17:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.748062 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.748115 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.748127 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.748145 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.748157 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:59Z","lastTransitionTime":"2025-10-07T17:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.850522 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.850569 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.850577 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.850591 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.850600 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:59Z","lastTransitionTime":"2025-10-07T17:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.952283 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.952323 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.952331 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.952344 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:03:59 crc kubenswrapper[4681]: I1007 17:03:59.952353 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:03:59Z","lastTransitionTime":"2025-10-07T17:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.054987 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.055033 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.055043 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.055057 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.055067 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:00Z","lastTransitionTime":"2025-10-07T17:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.157137 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.157176 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.157184 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.157199 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.157213 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:00Z","lastTransitionTime":"2025-10-07T17:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.259042 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.259081 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.259092 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.259110 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.259121 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:00Z","lastTransitionTime":"2025-10-07T17:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.361168 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.361222 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.361235 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.361252 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.361267 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:00Z","lastTransitionTime":"2025-10-07T17:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.463842 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.463904 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.463915 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.463933 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.463951 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:00Z","lastTransitionTime":"2025-10-07T17:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.567396 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.567433 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.567451 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.567466 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.567475 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:00Z","lastTransitionTime":"2025-10-07T17:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.670216 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.670257 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.670268 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.670282 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.670291 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:00Z","lastTransitionTime":"2025-10-07T17:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.772497 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.772551 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.772563 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.772584 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.772595 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:00Z","lastTransitionTime":"2025-10-07T17:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.875408 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.875462 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.875479 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.875506 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.875525 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:00Z","lastTransitionTime":"2025-10-07T17:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.978062 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.978111 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.978120 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.978138 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:00 crc kubenswrapper[4681]: I1007 17:04:00.978149 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:00Z","lastTransitionTime":"2025-10-07T17:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.028932 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:01 crc kubenswrapper[4681]: E1007 17:04:01.029087 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.030589 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:01 crc kubenswrapper[4681]: E1007 17:04:01.037307 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.030824 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:01 crc kubenswrapper[4681]: E1007 17:04:01.037490 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.030717 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:01 crc kubenswrapper[4681]: E1007 17:04:01.037611 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.080834 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.080918 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.080936 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.080960 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.080980 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:01Z","lastTransitionTime":"2025-10-07T17:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.183650 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.183684 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.183697 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.183713 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.183724 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:01Z","lastTransitionTime":"2025-10-07T17:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.286268 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.286333 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.286357 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.286383 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.286407 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:01Z","lastTransitionTime":"2025-10-07T17:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.388467 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.388497 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.388506 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.388518 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.388527 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:01Z","lastTransitionTime":"2025-10-07T17:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.490069 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.490099 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.490109 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.490122 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.490131 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:01Z","lastTransitionTime":"2025-10-07T17:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.592631 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.592679 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.592691 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.592709 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.592725 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:01Z","lastTransitionTime":"2025-10-07T17:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.694764 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.694794 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.694802 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.694814 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.694823 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:01Z","lastTransitionTime":"2025-10-07T17:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.797328 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.797354 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.797362 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.797377 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.797385 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:01Z","lastTransitionTime":"2025-10-07T17:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.899700 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.899736 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.899744 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.899758 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:01 crc kubenswrapper[4681]: I1007 17:04:01.899769 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:01Z","lastTransitionTime":"2025-10-07T17:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.002175 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.002210 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.002221 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.002236 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.002246 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:02Z","lastTransitionTime":"2025-10-07T17:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.104457 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.104486 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.104494 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.104509 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.104517 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:02Z","lastTransitionTime":"2025-10-07T17:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
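
[Annotation] The loop above is the kubelet's node-status heartbeat: the node stays NotReady because the container runtime reports NetworkReady=false until the cluster network provider writes a CNI config into /etc/kubernetes/cni/net.d/ (the message itself asks "Has your network provider started?"). A minimal illustrative sketch of that readiness condition, using the directory named in the log; the accepted extensions mirror common CNI config-loader behaviour and are an assumption here, not taken from this log:

# Sketch only: report whether any CNI configuration file exists in the
# directory the kubelet complains about. Path comes from the log line;
# the .conf/.conflist/.json extensions are an assumed convention.
from pathlib import Path

CNI_CONF_DIR = Path("/etc/kubernetes/cni/net.d")

def cni_configs() -> list[Path]:
    if not CNI_CONF_DIR.is_dir():
        return []
    return sorted(p for p in CNI_CONF_DIR.iterdir()
                  if p.suffix in {".conf", ".conflist", ".json"})

if __name__ == "__main__":
    found = cni_configs()
    if found:
        print("CNI configs present:", *(p.name for p in found))
    else:
        print("no CNI configuration file - network provider not started yet")

Once the network provider starts and drops its config there, the runtime flips NetworkReady to true and this heartbeat loop stops repeating.
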
Has your network provider started?"} Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.207039 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.207076 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.207085 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.207100 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.207112 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:02Z","lastTransitionTime":"2025-10-07T17:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.308915 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.308960 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.308976 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.308991 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.309002 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:02Z","lastTransitionTime":"2025-10-07T17:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.410793 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.411074 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.411176 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.411292 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.411401 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:02Z","lastTransitionTime":"2025-10-07T17:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.513911 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.514175 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.514246 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.514316 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.514386 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:02Z","lastTransitionTime":"2025-10-07T17:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.616724 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.616759 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.616770 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.616785 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.616795 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:02Z","lastTransitionTime":"2025-10-07T17:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.719660 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.719757 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.719784 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.719816 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.719838 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:02Z","lastTransitionTime":"2025-10-07T17:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
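
[Annotation] Each "Node became not ready" record embeds the Ready condition as a plain JSON object after condition=, so the transitions can be pulled out of a saved journal mechanically. A small sketch, assuming the journal was exported with one record per line (as journalctl emits it), since json.loads cannot parse a condition object split across lines:

# Sketch: extract the condition JSON from "Node became not ready" records.
import json
import re
import sys

PAT = re.compile(r'"Node became not ready" node="(?P<node>[^"]+)" '
                 r'condition=(?P<cond>\{.*?\})')

def conditions(journal_text: str):
    for m in PAT.finditer(journal_text):
        yield m.group("node"), json.loads(m.group("cond"))

if __name__ == "__main__":
    for node, cond in conditions(sys.stdin.read()):
        print(node, cond["reason"], cond["lastTransitionTime"],
              cond["message"].split(":", 1)[0])
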
Has your network provider started?"} Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.822151 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.822198 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.822210 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.822228 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.822239 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:02Z","lastTransitionTime":"2025-10-07T17:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.925142 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.925192 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.925204 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.925228 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:02 crc kubenswrapper[4681]: I1007 17:04:02.925240 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:02Z","lastTransitionTime":"2025-10-07T17:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.028154 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:03 crc kubenswrapper[4681]: E1007 17:04:03.028285 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.028327 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:03 crc kubenswrapper[4681]: E1007 17:04:03.028444 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.028599 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.028617 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.028632 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.028648 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.028669 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:03 crc kubenswrapper[4681]: E1007 17:04:03.028674 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.028685 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:03Z","lastTransitionTime":"2025-10-07T17:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.028710 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:03 crc kubenswrapper[4681]: E1007 17:04:03.028747 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.131379 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.131489 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.131528 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.131565 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.131588 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:03Z","lastTransitionTime":"2025-10-07T17:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.220032 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.220068 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.220078 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.220094 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.220105 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:03Z","lastTransitionTime":"2025-10-07T17:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:03 crc kubenswrapper[4681]: E1007 17:04:03.233651 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:03Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.237441 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.237605 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
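
[Annotation] The status patch above is rejected for a different reason than the CNI loop: the API server cannot deliver it because calling the webhook "node.network-node-identity.openshift.io" at https://127.0.0.1:9743/node fails TLS verification — the serving certificate expired 2025-08-24T17:21:41Z, well before the logged current time 2025-10-07T17:04:03Z. A hedged sketch to confirm the validity window from the node itself (the endpoint is loopback-only); it assumes the third-party cryptography package is available, and the host/port are taken from the error text:

# Sketch: fetch the webhook's serving certificate and print its validity
# window, even though it is expired (verification is disabled on purpose).
import socket
import ssl
from cryptography import x509  # assumed third-party dependency

HOST, PORT = "127.0.0.1", 9743  # endpoint from the webhook error above

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False      # must be disabled before CERT_NONE
ctx.verify_mode = ssl.CERT_NONE # accept the expired certificate

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        der = tls.getpeercert(binary_form=True)

cert = x509.load_der_x509_certificate(der)
print("subject:  ", cert.subject.rfc4514_string())
print("notBefore:", cert.not_valid_before)
print("notAfter: ", cert.not_valid_after)

Until that certificate is renewed, every node-status patch will keep failing the same way, which is why the same payload reappears verbatim in the retries below.
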
event="NodeHasNoDiskPressure" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.237711 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.237813 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.237928 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:03Z","lastTransitionTime":"2025-10-07T17:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:03 crc kubenswrapper[4681]: E1007 17:04:03.253094 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:03Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.256842 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.256893 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.256901 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.256915 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.256932 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:03Z","lastTransitionTime":"2025-10-07T17:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:03 crc kubenswrapper[4681]: E1007 17:04:03.280314 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:03Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.285058 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.285128 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
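
[Annotation] The x509 error carries both instants, so the size of the expiry gap can be read straight out of the log — the certificate had been expired for just under 44 days when these patch attempts ran:

# Both timestamps come from the error text above.
from datetime import datetime, timezone

now       = datetime(2025, 10, 7, 17, 4, 3, tzinfo=timezone.utc)    # "current time"
not_after = datetime(2025, 8, 24, 17, 21, 41, tzinfo=timezone.utc)  # "is after"
print(now - not_after)  # 43 days, 23:42:22
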
event="NodeHasNoDiskPressure" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.285150 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.285179 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.285199 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:03Z","lastTransitionTime":"2025-10-07T17:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:03 crc kubenswrapper[4681]: E1007 17:04:03.302271 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:03Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.306250 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.306303 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.306314 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.306333 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.306346 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:03Z","lastTransitionTime":"2025-10-07T17:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:03 crc kubenswrapper[4681]: E1007 17:04:03.321560 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:03Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:03 crc kubenswrapper[4681]: E1007 17:04:03.321789 4681 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.323691 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.323717 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.323725 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.323767 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.323777 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:03Z","lastTransitionTime":"2025-10-07T17:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.425537 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.425565 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.425575 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.425592 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.425602 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:03Z","lastTransitionTime":"2025-10-07T17:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.528355 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.528407 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.528424 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.528447 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.528463 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:03Z","lastTransitionTime":"2025-10-07T17:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.630562 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.630604 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.630619 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.630638 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.630654 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:03Z","lastTransitionTime":"2025-10-07T17:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.732469 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.732512 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.732528 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.732550 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.732566 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:03Z","lastTransitionTime":"2025-10-07T17:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.834630 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.834848 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.834944 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.835011 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.835072 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:03Z","lastTransitionTime":"2025-10-07T17:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.937268 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.937294 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.937336 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.937348 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:03 crc kubenswrapper[4681]: I1007 17:04:03.937356 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:03Z","lastTransitionTime":"2025-10-07T17:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.039484 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.039527 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.039544 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.039566 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.039582 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:04Z","lastTransitionTime":"2025-10-07T17:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.141386 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.141417 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.141426 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.141438 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.141446 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:04Z","lastTransitionTime":"2025-10-07T17:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.243272 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.243325 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.243361 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.243398 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.243420 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:04Z","lastTransitionTime":"2025-10-07T17:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.345788 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.345820 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.345828 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.345844 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.345854 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:04Z","lastTransitionTime":"2025-10-07T17:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.448355 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.448405 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.448417 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.448435 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.448446 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:04Z","lastTransitionTime":"2025-10-07T17:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.550632 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.550667 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.550675 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.550692 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.550701 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:04Z","lastTransitionTime":"2025-10-07T17:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.653140 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.653373 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.653446 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.653515 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.653620 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:04Z","lastTransitionTime":"2025-10-07T17:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.756096 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.756144 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.756158 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.756174 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.756184 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:04Z","lastTransitionTime":"2025-10-07T17:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.859001 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.859046 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.859059 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.859077 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.859089 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:04Z","lastTransitionTime":"2025-10-07T17:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.961618 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.961673 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.961685 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.961724 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:04 crc kubenswrapper[4681]: I1007 17:04:04.961737 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:04Z","lastTransitionTime":"2025-10-07T17:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.028654 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.028696 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.028681 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.028654 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:05 crc kubenswrapper[4681]: E1007 17:04:05.028810 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:04:05 crc kubenswrapper[4681]: E1007 17:04:05.028929 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:04:05 crc kubenswrapper[4681]: E1007 17:04:05.029012 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:04:05 crc kubenswrapper[4681]: E1007 17:04:05.029125 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.064498 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.064535 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.064543 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.064556 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.064566 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:05Z","lastTransitionTime":"2025-10-07T17:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.166627 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.166663 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.166672 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.166684 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.166694 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:05Z","lastTransitionTime":"2025-10-07T17:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.269081 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.269151 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.269168 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.269193 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.269211 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:05Z","lastTransitionTime":"2025-10-07T17:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.371325 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.371394 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.371407 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.371423 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.371435 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:05Z","lastTransitionTime":"2025-10-07T17:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.495242 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.495274 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.495282 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.495294 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.495303 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:05Z","lastTransitionTime":"2025-10-07T17:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.597233 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.597285 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.597294 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.597308 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.597318 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:05Z","lastTransitionTime":"2025-10-07T17:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.699919 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.699978 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.699991 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.700005 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.700014 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:05Z","lastTransitionTime":"2025-10-07T17:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.802522 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.802628 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.802656 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.802693 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.802720 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:05Z","lastTransitionTime":"2025-10-07T17:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.905500 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.905566 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.905585 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.905610 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:05 crc kubenswrapper[4681]: I1007 17:04:05.905630 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:05Z","lastTransitionTime":"2025-10-07T17:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.008679 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.008726 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.008737 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.008753 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.008767 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:06Z","lastTransitionTime":"2025-10-07T17:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.028753 4681 scope.go:117] "RemoveContainer" containerID="625e34e8f283301933c61a38ab86b870853697c4563bb53ad877cf1cea387ba0" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.111382 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.111446 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.111469 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.111495 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.111514 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:06Z","lastTransitionTime":"2025-10-07T17:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.214198 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.214243 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.214251 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.214264 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.214273 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:06Z","lastTransitionTime":"2025-10-07T17:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.317339 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.317367 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.317480 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.317495 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.317504 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:06Z","lastTransitionTime":"2025-10-07T17:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.352319 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6lkl_615b8d72-0ec5-42d0-966e-db1c2b787962/ovnkube-controller/1.log" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.357250 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerStarted","Data":"500a6acc2b5dade410b551572ee9d73898ac20197b5aaf970f732bb2af507087"} Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.357984 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.371942 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:06Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.383063 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:06Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.398676 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:06Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.413989 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:06Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.419348 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.419402 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.419418 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.419442 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.419462 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:06Z","lastTransitionTime":"2025-10-07T17:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.438275 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f774e29adda222a4662ace0017b8b7697481452841491133bf4839360ba9a8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:06Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.459604 4681 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"container
ID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c07906391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5
cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:06Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.481797 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:06Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.508808 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://500a6acc2b5dade410b551572ee9d73898ac2019
7b5aaf970f732bb2af507087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://625e34e8f283301933c61a38ab86b870853697c4563bb53ad877cf1cea387ba0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:03:49Z\\\",\\\"message\\\":\\\"licy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 17:03:49.125929 6046 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.126286 6046 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 17:03:49.126439 6046 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.126680 6046 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 17:03:49.126995 6046 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 17:03:49.127071 6046 factory.go:656] Stopping watch factory\\\\nI1007 17:03:49.127071 6046 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.127090 6046 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 17:03:49.127215 6046 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from 
github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.16
8.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:06Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.522178 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.522215 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.522223 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.522236 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.522246 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:06Z","lastTransitionTime":"2025-10-07T17:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.524311 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:06Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.536767 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:06Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.546239 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a2c488b-e563-4bc2-aaec-064d33709f54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93918b739fba58a9b4e23d3131ae90e34548a07be5a59ceb18accaa6516572e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f7b57b64232d8263f8b4c64781ddba6d33986e7d379a64caa7b5abc9dae9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-52862\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:06Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.559530 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8e19e2693808909e599802074123b31395c3cb8992f734a1d7191848532e953\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:06Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.569663 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:06Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.580840 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xjf9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35b1b84e-518a-4567-8ad9-0e717e9958fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xjf9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:06Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.593749 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:06Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.608201 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:06Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.621702 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:06Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.623939 4681 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.623981 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.623992 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.624006 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.624015 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:06Z","lastTransitionTime":"2025-10-07T17:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.726084 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.726125 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.726133 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.726147 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.726157 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:06Z","lastTransitionTime":"2025-10-07T17:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.828115 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.828155 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.828164 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.828179 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.828247 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:06Z","lastTransitionTime":"2025-10-07T17:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.930210 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.930498 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.930509 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.930525 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:06 crc kubenswrapper[4681]: I1007 17:04:06.930536 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:06Z","lastTransitionTime":"2025-10-07T17:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.028390 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.028779 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.028867 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:07 crc kubenswrapper[4681]: E1007 17:04:07.028795 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:04:07 crc kubenswrapper[4681]: E1007 17:04:07.028997 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.029028 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:07 crc kubenswrapper[4681]: E1007 17:04:07.029080 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:04:07 crc kubenswrapper[4681]: E1007 17:04:07.029212 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.035441 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.036010 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.036048 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.036068 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.036080 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:07Z","lastTransitionTime":"2025-10-07T17:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.056250 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.071458 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.084377 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.099104 4681 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.114418 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xjf9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35b1b84e-518a-4567-8ad9-0e717e9958fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xjf9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.130121 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.139587 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.139853 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.139936 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.139966 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.140100 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:07Z","lastTransitionTime":"2025-10-07T17:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.145138 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.159167 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f774e29adda222a4662ace0017b8b7697481452841491133bf4839360ba9a8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cn
ibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.169522 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.182103 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.210947 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c0790
6391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.223606 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.240559 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/35b1b84e-518a-4567-8ad9-0e717e9958fb-metrics-certs\") pod \"network-metrics-daemon-xjf9z\" (UID: \"35b1b84e-518a-4567-8ad9-0e717e9958fb\") " pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:07 crc kubenswrapper[4681]: E1007 17:04:07.240759 4681 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 17:04:07 crc kubenswrapper[4681]: E1007 17:04:07.240831 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35b1b84e-518a-4567-8ad9-0e717e9958fb-metrics-certs podName:35b1b84e-518a-4567-8ad9-0e717e9958fb nodeName:}" failed. No retries permitted until 2025-10-07 17:04:23.240814528 +0000 UTC m=+66.888226093 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/35b1b84e-518a-4567-8ad9-0e717e9958fb-metrics-certs") pod "network-metrics-daemon-xjf9z" (UID: "35b1b84e-518a-4567-8ad9-0e717e9958fb") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.242433 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.242478 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.242494 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.242516 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.242532 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:07Z","lastTransitionTime":"2025-10-07T17:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.244813 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://500a6acc2b5dade410b551572ee9d73898ac20197b5aaf970f732bb2af507087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://625e34e8f283301933c61a38ab86b870853697c4563bb53ad877cf1cea387ba0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:03:49Z\\\",\\\"message\\\":\\\"licy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 17:03:49.125929 6046 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.126286 6046 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 17:03:49.126439 6046 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.126680 6046 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 17:03:49.126995 6046 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 17:03:49.127071 6046 factory.go:656] Stopping watch factory\\\\nI1007 17:03:49.127071 6046 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.127090 6046 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 17:03:49.127215 6046 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from 
github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.16
8.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.264175 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8e19e2693808909e599802074123b31395c3cb8992f734a1d7191848532e953\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z"
Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.288449 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.321594 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.335974 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a2c488b-e563-4bc2-aaec-064d33709f54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93918b739fba58a9b4e23d3131ae90e34548a07be5a59ceb18accaa6516572e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f7b57b64232d8263f8b4c64781ddba6d33986e7d379a64caa7b5abc9dae9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-52862\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z"
Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.344790 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.344834 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.344845 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.344864 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.344877 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:07Z","lastTransitionTime":"2025-10-07T17:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.361582 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6lkl_615b8d72-0ec5-42d0-966e-db1c2b787962/ovnkube-controller/2.log"
Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.362069 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6lkl_615b8d72-0ec5-42d0-966e-db1c2b787962/ovnkube-controller/1.log"
Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.364303 4681 generic.go:334] "Generic (PLEG): container finished" podID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerID="500a6acc2b5dade410b551572ee9d73898ac20197b5aaf970f732bb2af507087" exitCode=1
Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.364333 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerDied","Data":"500a6acc2b5dade410b551572ee9d73898ac20197b5aaf970f732bb2af507087"}
Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.364376 4681 scope.go:117] "RemoveContainer" containerID="625e34e8f283301933c61a38ab86b870853697c4563bb53ad877cf1cea387ba0"
Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.365059 4681 scope.go:117] "RemoveContainer" containerID="500a6acc2b5dade410b551572ee9d73898ac20197b5aaf970f732bb2af507087"
Oct 07 17:04:07 crc kubenswrapper[4681]: E1007 17:04:07.365238 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-d6lkl_openshift-ovn-kubernetes(615b8d72-0ec5-42d0-966e-db1c2b787962)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962"
Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.389943 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c0790
6391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.401219 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z"
Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.417753 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://500a6acc2b5dade410b551572ee9d73898ac2019
7b5aaf970f732bb2af507087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://625e34e8f283301933c61a38ab86b870853697c4563bb53ad877cf1cea387ba0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:03:49Z\\\",\\\"message\\\":\\\"licy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 17:03:49.125929 6046 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.126286 6046 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 17:03:49.126439 6046 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.126680 6046 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 17:03:49.126995 6046 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 17:03:49.127071 6046 factory.go:656] Stopping watch factory\\\\nI1007 17:03:49.127071 6046 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.127090 6046 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 17:03:49.127215 6046 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://500a6acc2b5dade410b551572ee9d73898ac20197b5aaf970f732bb2af507087\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:04:06Z\\\",\\\"message\\\":\\\"Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.16:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {3f1b9878-e751-4e46-a226-ce007d2c4aa7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 17:04:06.863844 6261 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 17:04:06.865049 6261 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 17:04:06.865103 6261 factory.go:656] Stopping watch factory\\\\nI1007 17:04:06.865140 6261 ovnkube.go:599] Stopped ovnkube\\\\nI1007 17:04:06.865187 6261 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 17:04:06.865224 6261 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 17:04:06.865259 6261 metrics.go:553] Stopping 
metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 17:04:06.865337 6261 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.428800 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.438837 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a2c488b-e563-4bc2-aaec-064d33709f54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93918b739fba58a9b4e23d3131ae90e34548a07be5a59ceb18accaa6516572e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f7b57b64232d8263f8b4c64781ddba6d33986e7d379a64caa7b5abc9dae9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-52862\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z"
Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.446829 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.446907 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.446920 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.446935 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.446967 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:07Z","lastTransitionTime":"2025-10-07T17:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.457305 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8e19e2693808909e599802074123b31395c3cb8992f734a1d7191848532e953\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.467851 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.477441 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xjf9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35b1b84e-518a-4567-8ad9-0e717e9958fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xjf9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.487470 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.497364 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.508348 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.519535 4681 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.527634 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.539524 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.548758 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.548786 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.548794 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.548808 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.548816 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:07Z","lastTransitionTime":"2025-10-07T17:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.551007 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.566482 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f774e29adda222a4662ace0017b8b7697481452841491133bf4839360ba9a8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cn
ibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.576727 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:07Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.650719 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.650759 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.650769 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.650784 4681 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.650794 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:07Z","lastTransitionTime":"2025-10-07T17:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.752683 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.752721 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.752736 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.752752 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.752764 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:07Z","lastTransitionTime":"2025-10-07T17:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.855253 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.855289 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.855299 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.855314 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.855324 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:07Z","lastTransitionTime":"2025-10-07T17:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.957890 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.957942 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.957955 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.957978 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:07 crc kubenswrapper[4681]: I1007 17:04:07.957992 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:07Z","lastTransitionTime":"2025-10-07T17:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.060666 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.060748 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.060770 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.060789 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.060836 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:08Z","lastTransitionTime":"2025-10-07T17:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.163507 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.163640 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.163659 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.163683 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.163699 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:08Z","lastTransitionTime":"2025-10-07T17:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.252179 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.266111 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.266180 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.266213 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.266242 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.266264 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:08Z","lastTransitionTime":"2025-10-07T17:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.272120 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126b
Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.272120 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.286719 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.302257 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.320738 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.330141 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xjf9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35b1b84e-518a-4567-8ad9-0e717e9958fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xjf9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.341943 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.354190 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.367685 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.367712 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.367721 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.367733 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.367742 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:08Z","lastTransitionTime":"2025-10-07T17:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.368007 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f774e29adda222a4662ace0017b8b7697481452841491133bf4839360ba9a8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.369141 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6lkl_615b8d72-0ec5-42d0-966e-db1c2b787962/ovnkube-controller/2.log"
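
Every status-patch failure in this stretch fails the same way: the kubelet's PATCH is intercepted by the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743, whose serving certificate expired on 2025-08-24T17:21:41Z, weeks before the node's current clock of 2025-10-07. A minimal sketch to read the offending certificate straight off the listener; it assumes Python 3 with the third-party cryptography package installed and that the endpoint completes an anonymous TLS handshake:

    #!/usr/bin/env python3
    # Sketch: fetch the webhook's serving cert and compare notAfter with now.
    import ssl
    from datetime import datetime, timezone
    from cryptography import x509  # third-party; assumed installed

    HOST, PORT = "127.0.0.1", 9743  # endpoint from the webhook errors above

    pem = ssl.get_server_certificate((HOST, PORT))  # no verification, so an
    cert = x509.load_pem_x509_certificate(pem.encode())  # expired cert is fine
    not_after = cert.not_valid_after_utc  # needs cryptography >= 42
    now = datetime.now(timezone.utc)
    print("subject: ", cert.subject.rfc4514_string())
    print("notAfter:", not_after.isoformat())
    print("expired: ", now > not_after)  # True reproduces the x509 error above
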
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6lkl_615b8d72-0ec5-42d0-966e-db1c2b787962/ovnkube-controller/2.log" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.372790 4681 scope.go:117] "RemoveContainer" containerID="500a6acc2b5dade410b551572ee9d73898ac20197b5aaf970f732bb2af507087" Oct 07 17:04:08 crc kubenswrapper[4681]: E1007 17:04:08.372960 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-d6lkl_openshift-ovn-kubernetes(615b8d72-0ec5-42d0-966e-db1c2b787962)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.380234 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.390739 4681 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.408990 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c0790
6391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.420805 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.437653 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://500a6acc2b5dade410b551572ee9d73898ac2019
7b5aaf970f732bb2af507087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://625e34e8f283301933c61a38ab86b870853697c4563bb53ad877cf1cea387ba0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:03:49Z\\\",\\\"message\\\":\\\"licy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 17:03:49.125929 6046 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.126286 6046 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 17:03:49.126439 6046 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.126680 6046 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1007 17:03:49.126995 6046 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 17:03:49.127071 6046 factory.go:656] Stopping watch factory\\\\nI1007 17:03:49.127071 6046 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1007 17:03:49.127090 6046 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 17:03:49.127215 6046 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://500a6acc2b5dade410b551572ee9d73898ac20197b5aaf970f732bb2af507087\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:04:06Z\\\",\\\"message\\\":\\\"Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.16:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {3f1b9878-e751-4e46-a226-ce007d2c4aa7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 17:04:06.863844 6261 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 17:04:06.865049 6261 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 17:04:06.865103 6261 factory.go:656] Stopping watch factory\\\\nI1007 17:04:06.865140 6261 ovnkube.go:599] Stopped ovnkube\\\\nI1007 17:04:06.865187 6261 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 17:04:06.865224 6261 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 17:04:06.865259 6261 metrics.go:553] Stopping 
metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 17:04:06.865337 6261 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.446626 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a2c488b-e563-4bc2-aaec-064d33709f54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93918b739fba58a9b4e23d3131ae90e34548a07be5a59ceb18accaa6516572e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f7b57b64232d8263f8b4c64781ddba6d33986e7d379a64caa7b5abc9dae9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-52862\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 
17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.458088 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8e19e2693808909e599802074123b31395c3cb8992f734a1d7191848532e953\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.469709 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.469773 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.469784 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.469809 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.469818 4681 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:08Z","lastTransitionTime":"2025-10-07T17:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.472464 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.481987 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.493193 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8e19e2693808909e599802074123b31395c3cb8992
f734a1d7191848532e953\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.503504 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.514347 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.524363 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a2c488b-e563-4bc2-aaec-064d33709f54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93918b739fba58a9b4e23d3131ae90e34548a07be5a59ceb18accaa6516572e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f7b57b64232d8263f8b4c64781ddba6d33986e7d379a64caa7b5abc9dae9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-52862\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.534058 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.543946 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.554261 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.564072 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.571980 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.572013 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.572025 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.572050 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.572062 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:08Z","lastTransitionTime":"2025-10-07T17:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.573495 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xjf9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35b1b84e-518a-4567-8ad9-0e717e9958fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xjf9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.584162 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.594216 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.605409 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f774e29adda222a4662ace0017b8b7697481452841491133bf4839360ba9a8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.613249 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.622455 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.639411 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c0790
6391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.654366 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.671001 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://500a6acc2b5dade410b551572ee9d73898ac2019
7b5aaf970f732bb2af507087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://500a6acc2b5dade410b551572ee9d73898ac20197b5aaf970f732bb2af507087\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:04:06Z\\\",\\\"message\\\":\\\"Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.16:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {3f1b9878-e751-4e46-a226-ce007d2c4aa7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 17:04:06.863844 6261 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 17:04:06.865049 6261 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 17:04:06.865103 6261 factory.go:656] Stopping watch factory\\\\nI1007 17:04:06.865140 6261 ovnkube.go:599] Stopped ovnkube\\\\nI1007 17:04:06.865187 6261 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 17:04:06.865224 6261 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 17:04:06.865259 6261 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 17:04:06.865337 6261 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:04:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d6lkl_openshift-ovn-kubernetes(615b8d72-0ec5-42d0-966e-db1c2b787962)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.674407 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.674430 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.674439 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.674451 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.674460 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:08Z","lastTransitionTime":"2025-10-07T17:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.758199 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.766071 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.770611 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d
7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8e19e2693808909e599802074123b31395c3cb8992f734a1d7191848532e953\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.775995 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.776023 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.776033 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.776047 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.776058 4681 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:08Z","lastTransitionTime":"2025-10-07T17:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.780749 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.789443 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.799216 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a2c488b-e563-4bc2-aaec-064d33709f54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93918b739fba58a9b4e23d3131ae90e34548a07be5a59ceb18accaa6516572e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f7b57b64232d8263f8b4c64781ddba6d33986e7d379a64caa7b5abc9dae9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-52862\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.811625 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.826099 4681 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.837697 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xjf9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35b1b84e-518a-4567-8ad9-0e717e9958fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xjf9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.847928 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.852201 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.852322 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:08 crc kubenswrapper[4681]: E1007 17:04:08.852341 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:04:40.852314012 +0000 UTC m=+84.499725567 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.852359 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.852386 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.852414 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:08 crc kubenswrapper[4681]: E1007 17:04:08.852505 4681 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 17:04:08 crc kubenswrapper[4681]: E1007 17:04:08.852544 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 17:04:40.852536437 +0000 UTC m=+84.499947992 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 17:04:08 crc kubenswrapper[4681]: E1007 17:04:08.852540 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 17:04:08 crc kubenswrapper[4681]: E1007 17:04:08.852561 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 17:04:08 crc kubenswrapper[4681]: E1007 17:04:08.852581 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 17:04:08 crc kubenswrapper[4681]: E1007 17:04:08.852600 4681 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:04:08 crc kubenswrapper[4681]: E1007 17:04:08.852586 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 17:04:08 crc kubenswrapper[4681]: E1007 17:04:08.852665 4681 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:04:08 crc kubenswrapper[4681]: E1007 17:04:08.852540 4681 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 17:04:08 crc kubenswrapper[4681]: E1007 17:04:08.852653 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 17:04:40.85263484 +0000 UTC m=+84.500046405 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:04:08 crc kubenswrapper[4681]: E1007 17:04:08.852730 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 17:04:40.852710202 +0000 UTC m=+84.500121847 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:04:08 crc kubenswrapper[4681]: E1007 17:04:08.852748 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 17:04:40.852738913 +0000 UTC m=+84.500150608 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.861806 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.876329 4681 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f774e29adda222a4662ace0017b8b7697481452841491133bf4839360ba9a8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.877981 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.878006 4681 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.878018 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.878034 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.878043 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:08Z","lastTransitionTime":"2025-10-07T17:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.885624 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" 
Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.894452 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.904807 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.918365 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.946151 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc1
30cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://500a6acc2b5dade410b551572ee9d73898ac20197b5aaf970f732bb2af507087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://500a6acc2b5dade410b551572ee9d73898ac20197b5aaf970f732bb2af507087\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:04:06Z\\\",\\\"message\\\":\\\"Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.16:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {3f1b9878-e751-4e46-a226-ce007d2c4aa7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 17:04:06.863844 6261 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 17:04:06.865049 6261 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 17:04:06.865103 6261 factory.go:656] Stopping watch factory\\\\nI1007 17:04:06.865140 6261 ovnkube.go:599] Stopped ovnkube\\\\nI1007 17:04:06.865187 6261 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 17:04:06.865224 6261 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 17:04:06.865259 6261 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 17:04:06.865337 6261 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:04:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-d6lkl_openshift-ovn-kubernetes(615b8d72-0ec5-42d0-966e-db1c2b787962)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.968899 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c0790
6391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.980001 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.980041 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.980050 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.980066 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.980077 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:08Z","lastTransitionTime":"2025-10-07T17:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:08 crc kubenswrapper[4681]: I1007 17:04:08.981190 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:08Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.029005 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.029072 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.029014 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:09 crc kubenswrapper[4681]: E1007 17:04:09.029173 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.029182 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:09 crc kubenswrapper[4681]: E1007 17:04:09.029269 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:04:09 crc kubenswrapper[4681]: E1007 17:04:09.029324 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:04:09 crc kubenswrapper[4681]: E1007 17:04:09.029470 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.082322 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.082358 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.082368 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.082380 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.082408 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:09Z","lastTransitionTime":"2025-10-07T17:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.185388 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.185435 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.185449 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.185468 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.185479 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:09Z","lastTransitionTime":"2025-10-07T17:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.287732 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.287782 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.287803 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.287820 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.287831 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:09Z","lastTransitionTime":"2025-10-07T17:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.389674 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.389721 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.389733 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.389757 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.389775 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:09Z","lastTransitionTime":"2025-10-07T17:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.493103 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.493143 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.493155 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.493171 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.493181 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:09Z","lastTransitionTime":"2025-10-07T17:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.595769 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.595849 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.595871 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.595939 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.595962 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:09Z","lastTransitionTime":"2025-10-07T17:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.697909 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.697950 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.697961 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.697974 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.697984 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:09Z","lastTransitionTime":"2025-10-07T17:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.800798 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.800837 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.800847 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.800863 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.800893 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:09Z","lastTransitionTime":"2025-10-07T17:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.903350 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.903391 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.903401 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.903419 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:09 crc kubenswrapper[4681]: I1007 17:04:09.903428 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:09Z","lastTransitionTime":"2025-10-07T17:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:10 crc kubenswrapper[4681]: I1007 17:04:10.006293 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:10 crc kubenswrapper[4681]: I1007 17:04:10.006355 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:10 crc kubenswrapper[4681]: I1007 17:04:10.006375 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:10 crc kubenswrapper[4681]: I1007 17:04:10.006408 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:10 crc kubenswrapper[4681]: I1007 17:04:10.006424 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:10Z","lastTransitionTime":"2025-10-07T17:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:10 crc kubenswrapper[4681]: I1007 17:04:10.109411 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:10 crc kubenswrapper[4681]: I1007 17:04:10.109492 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:10 crc kubenswrapper[4681]: I1007 17:04:10.109515 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:10 crc kubenswrapper[4681]: I1007 17:04:10.109569 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:10 crc kubenswrapper[4681]: I1007 17:04:10.109592 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:10Z","lastTransitionTime":"2025-10-07T17:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:10 crc kubenswrapper[4681]: I1007 17:04:10.212238 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:10 crc kubenswrapper[4681]: I1007 17:04:10.212323 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:10 crc kubenswrapper[4681]: I1007 17:04:10.212343 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:10 crc kubenswrapper[4681]: I1007 17:04:10.212367 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:10 crc kubenswrapper[4681]: I1007 17:04:10.212386 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:10Z","lastTransitionTime":"2025-10-07T17:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:10 crc kubenswrapper[4681]: I1007 17:04:10.313931 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:10 crc kubenswrapper[4681]: I1007 17:04:10.314731 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:10 crc kubenswrapper[4681]: I1007 17:04:10.314753 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:10 crc kubenswrapper[4681]: I1007 17:04:10.314775 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:10 crc kubenswrapper[4681]: I1007 17:04:10.314790 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:10Z","lastTransitionTime":"2025-10-07T17:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 07 17:04:11 crc kubenswrapper[4681]: I1007 17:04:11.028761 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 17:04:11 crc kubenswrapper[4681]: I1007 17:04:11.028842 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 17:04:11 crc kubenswrapper[4681]: I1007 17:04:11.028872 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 17:04:11 crc kubenswrapper[4681]: I1007 17:04:11.029051 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z"
Oct 07 17:04:11 crc kubenswrapper[4681]: E1007 17:04:11.029045 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 17:04:11 crc kubenswrapper[4681]: E1007 17:04:11.029199 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 17:04:11 crc kubenswrapper[4681]: E1007 17:04:11.029369 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 17:04:11 crc kubenswrapper[4681]: E1007 17:04:11.029523 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb"
Oct 07 17:04:11 crc kubenswrapper[4681]: I1007 17:04:11.032206 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:11 crc kubenswrapper[4681]: I1007 17:04:11.032248 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:04:11 crc kubenswrapper[4681]: I1007 17:04:11.032266 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:11 crc kubenswrapper[4681]: I1007 17:04:11.032289 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:11 crc kubenswrapper[4681]: I1007 17:04:11.032306 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:11Z","lastTransitionTime":"2025-10-07T17:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:04:11 crc kubenswrapper[4681]: I1007 17:04:11.134250 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:11 crc kubenswrapper[4681]: I1007 17:04:11.134290 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:04:11 crc kubenswrapper[4681]: I1007 17:04:11.134299 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:11 crc kubenswrapper[4681]: I1007 17:04:11.134314 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:11 crc kubenswrapper[4681]: I1007 17:04:11.134323 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:11Z","lastTransitionTime":"2025-10-07T17:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.029007 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.029025 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z"
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.029122 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 17:04:13 crc kubenswrapper[4681]: E1007 17:04:13.029265 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.029321 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 17:04:13 crc kubenswrapper[4681]: E1007 17:04:13.029471 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 17:04:13 crc kubenswrapper[4681]: E1007 17:04:13.029541 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb"
Oct 07 17:04:13 crc kubenswrapper[4681]: E1007 17:04:13.029418 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.085759 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.085820 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.085836 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.085859 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.085909 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:13Z","lastTransitionTime":"2025-10-07T17:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.189632 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.189716 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.189741 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.189772 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.189795 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:13Z","lastTransitionTime":"2025-10-07T17:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.292626 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.292703 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.292725 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.292754 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.292817 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:13Z","lastTransitionTime":"2025-10-07T17:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.395567 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.395617 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.395629 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.395647 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.395661 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:13Z","lastTransitionTime":"2025-10-07T17:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.445423 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.445511 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.445534 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.445561 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.445579 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:13Z","lastTransitionTime":"2025-10-07T17:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:13 crc kubenswrapper[4681]: E1007 17:04:13.471710 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:13Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.476611 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.476652 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.476668 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.476689 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.476706 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:13Z","lastTransitionTime":"2025-10-07T17:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:13 crc kubenswrapper[4681]: E1007 17:04:13.498184 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:13Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.509719 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.509813 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.509835 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.509866 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.509922 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:13Z","lastTransitionTime":"2025-10-07T17:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:13 crc kubenswrapper[4681]: E1007 17:04:13.536076 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:13Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.540845 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.540915 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.540932 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.540955 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.540971 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:13Z","lastTransitionTime":"2025-10-07T17:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:13 crc kubenswrapper[4681]: E1007 17:04:13.560329 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:13Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.565194 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.565285 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.565305 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.565334 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.565353 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:13Z","lastTransitionTime":"2025-10-07T17:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:13 crc kubenswrapper[4681]: E1007 17:04:13.583859 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:13Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:13 crc kubenswrapper[4681]: E1007 17:04:13.584141 4681 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.586544 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
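The retry loop above is fully explained by the last clause of each error: the node.network-node-identity.openshift.io webhook serves a certificate that expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-10-07T17:04:13Z, so every status patch is rejected before it reaches the Node object. A minimal Go sketch to confirm what the endpoint is serving (assumptions: it is run on the CRC VM itself, where the webhook listens on 127.0.0.1:9743, and verification is deliberately skipped so the expired chain can still be inspected):

package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// Dial the webhook endpoint named in the kubelet error. InsecureSkipVerify
	// lets the handshake complete even though the chain no longer verifies,
	// so the expired leaf certificate can be read back and inspected.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Println("dial failed:", err)
		return
	}
	defer conn.Close()

	leaf := conn.ConnectionState().PeerCertificates[0]
	fmt.Println("subject:  ", leaf.Subject)
	fmt.Println("notBefore:", leaf.NotBefore.Format(time.RFC3339))
	fmt.Println("notAfter: ", leaf.NotAfter.Format(time.RFC3339))
	if time.Now().After(leaf.NotAfter) {
		// Matches the kubelet's "x509: certificate has expired" failure.
		fmt.Println("=> certificate is expired")
	}
}

On CRC this state typically clears only once the cluster's internal certificates are regenerated, either by letting the built-in rotation run after "crc start" or by recreating the instance with "crc delete" followed by "crc start".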
event="NodeHasSufficientMemory" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.586594 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.586612 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.586635 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.586653 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:13Z","lastTransitionTime":"2025-10-07T17:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.689966 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.690049 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.690107 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.690133 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.690149 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:13Z","lastTransitionTime":"2025-10-07T17:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.792713 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.792788 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.792808 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.792834 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.792851 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:13Z","lastTransitionTime":"2025-10-07T17:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.896051 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.896118 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.896135 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.896161 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:13 crc kubenswrapper[4681]: I1007 17:04:13.896178 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:13Z","lastTransitionTime":"2025-10-07T17:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.000289 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.001044 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.001078 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.001119 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.001143 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:14Z","lastTransitionTime":"2025-10-07T17:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.104621 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.104711 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.104727 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.104752 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.104771 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:14Z","lastTransitionTime":"2025-10-07T17:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.208346 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.208413 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.208431 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.208458 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.208477 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:14Z","lastTransitionTime":"2025-10-07T17:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.311556 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.311645 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.311662 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.311686 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.311705 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:14Z","lastTransitionTime":"2025-10-07T17:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.414272 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.414318 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.414329 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.414346 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.414359 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:14Z","lastTransitionTime":"2025-10-07T17:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.516703 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.516774 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.516800 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.516830 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.516863 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:14Z","lastTransitionTime":"2025-10-07T17:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.619698 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.619744 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.619756 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.619771 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.619783 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:14Z","lastTransitionTime":"2025-10-07T17:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.722601 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.722639 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.722647 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.722665 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.722674 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:14Z","lastTransitionTime":"2025-10-07T17:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.826366 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.826440 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.826459 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.826488 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.826508 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:14Z","lastTransitionTime":"2025-10-07T17:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.929226 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.929255 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.929263 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.929277 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:14 crc kubenswrapper[4681]: I1007 17:04:14.929286 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:14Z","lastTransitionTime":"2025-10-07T17:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.028974 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.029042 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.029041 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:15 crc kubenswrapper[4681]: E1007 17:04:15.029124 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb"
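
Every entry in this burst reports the same root cause: the kubelet's network plugin finds no CNI definition on disk. As a rough sketch of that discovery step (a simplified illustration, not the actual libcni code; only the directory path and error wording come from the log above), the check amounts to scanning the conf directory and failing when it is empty:

    // cnicheck.go - minimal sketch of CNI config discovery. Assumes only
    // the behavior implied by the log: scan a conf dir for network
    // definitions and report NetworkPluginNotReady when none exist.
    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    // findCNIConfig returns CNI config files in dir (os.ReadDir sorts by name).
    func findCNIConfig(dir string) ([]string, error) {
        entries, err := os.ReadDir(dir)
        if err != nil {
            return nil, err
        }
        var confs []string
        for _, e := range entries {
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json":
                confs = append(confs, filepath.Join(dir, e.Name()))
            }
        }
        if len(confs) == 0 {
            // The condition the kubelet surfaces as NetworkPluginNotReady.
            return nil, fmt.Errorf("no CNI configuration file in %s. Has your network provider started?", dir)
        }
        return confs, nil
    }

    func main() {
        confs, err := findCNIConfig("/etc/kubernetes/cni/net.d/")
        if err != nil {
            fmt.Println("network not ready:", err)
            os.Exit(1)
        }
        fmt.Println("CNI configs:", confs)
    }

On this node the directory presumably stays empty until the OVN components write their config, which is consistent with the ovnkube-controller CrashLoopBackOff reported further down.
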
pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.029001 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:15 crc kubenswrapper[4681]: E1007 17:04:15.029284 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:04:15 crc kubenswrapper[4681]: E1007 17:04:15.029477 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:04:15 crc kubenswrapper[4681]: E1007 17:04:15.029527 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.031145 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.031182 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.031197 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.031215 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.031232 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:15Z","lastTransitionTime":"2025-10-07T17:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.134033 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.134086 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.134098 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.134115 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.134126 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:15Z","lastTransitionTime":"2025-10-07T17:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.236605 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.236670 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.236690 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.236716 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.236733 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:15Z","lastTransitionTime":"2025-10-07T17:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.339100 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.339391 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.339507 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.339575 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.339632 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:15Z","lastTransitionTime":"2025-10-07T17:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.442290 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.442332 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.442343 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.442358 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.442370 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:15Z","lastTransitionTime":"2025-10-07T17:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.544825 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.545171 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.545236 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.545303 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.545378 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:15Z","lastTransitionTime":"2025-10-07T17:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.648771 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.649047 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.649152 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.649230 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.649300 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:15Z","lastTransitionTime":"2025-10-07T17:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.752705 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.752756 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.752774 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.752796 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.752814 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:15Z","lastTransitionTime":"2025-10-07T17:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.855355 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.855698 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.855920 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.856068 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.856198 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:15Z","lastTransitionTime":"2025-10-07T17:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.959305 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.959585 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.959659 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.959732 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:15 crc kubenswrapper[4681]: I1007 17:04:15.959822 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:15Z","lastTransitionTime":"2025-10-07T17:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.062996 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.063982 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.064022 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.064045 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.064063 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:16Z","lastTransitionTime":"2025-10-07T17:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.167353 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.167433 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.167459 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.167492 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.167517 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:16Z","lastTransitionTime":"2025-10-07T17:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.270286 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.270482 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.271104 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.271261 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.271373 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:16Z","lastTransitionTime":"2025-10-07T17:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.373733 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.374011 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.374094 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.374187 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.374293 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:16Z","lastTransitionTime":"2025-10-07T17:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.477312 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.477572 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.477860 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.478012 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.478135 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:16Z","lastTransitionTime":"2025-10-07T17:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.581028 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.581079 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.581095 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.581119 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.581136 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:16Z","lastTransitionTime":"2025-10-07T17:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.683937 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.683979 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.683997 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.684020 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.684037 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:16Z","lastTransitionTime":"2025-10-07T17:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.786734 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.786794 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.786813 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.786837 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.786854 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:16Z","lastTransitionTime":"2025-10-07T17:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.889679 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.889750 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.889775 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.889805 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.889828 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:16Z","lastTransitionTime":"2025-10-07T17:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
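
The same five-entry status block repeats at roughly 100 ms intervals (timestamps 16.684, 16.786, 16.889, ...). A throwaway analysis sketch (it assumes only the klog timestamp format visible in this journal, nothing kubelet-internal; the file name is a placeholder) measures that cadence from a saved capture:

    // cadence.go - measure the interval between successive "Node became
    // not ready" records in a saved journal file, using the klog
    // timestamp format seen in this log (e.g. "I1007 17:04:16.684037").
    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
        "time"
    )

    var ts = regexp.MustCompile(`I\d{4} (\d{2}:\d{2}:\d{2}\.\d{6}) \d+ setters\.go`)

    func main() {
        if len(os.Args) != 2 {
            fmt.Fprintln(os.Stderr, "usage: cadence <journal-file>")
            os.Exit(2)
        }
        f, err := os.Open(os.Args[1])
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        defer f.Close()

        var prev time.Time
        sc := bufio.NewScanner(f)
        sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines are long
        for sc.Scan() {
            for _, m := range ts.FindAllStringSubmatch(sc.Text(), -1) {
                // Times carry no date; fine for intervals within one day.
                t, err := time.Parse("15:04:05.000000", m[1])
                if err != nil {
                    continue
                }
                if !prev.IsZero() {
                    fmt.Printf("%v since previous not-ready update\n", t.Sub(prev))
                }
                prev = t
            }
        }
    }

Run against a capture of this journal, the gaps should come out close to 100 ms for the burst above.
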
Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.992440 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.992472 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.992482 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.992500 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:16 crc kubenswrapper[4681]: I1007 17:04:16.992512 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:16Z","lastTransitionTime":"2025-10-07T17:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.028259 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.028366 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.028455 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:17 crc kubenswrapper[4681]: E1007 17:04:17.028463 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:04:17 crc kubenswrapper[4681]: E1007 17:04:17.028585 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:04:17 crc kubenswrapper[4681]: E1007 17:04:17.028683 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.028856 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:17 crc kubenswrapper[4681]: E1007 17:04:17.028962 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.052441 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf6
48c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c07906391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:17Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.065397 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:17Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.085270 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://500a6acc2b5dade410b551572ee9d73898ac2019
7b5aaf970f732bb2af507087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://500a6acc2b5dade410b551572ee9d73898ac20197b5aaf970f732bb2af507087\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:04:06Z\\\",\\\"message\\\":\\\"Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.16:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {3f1b9878-e751-4e46-a226-ce007d2c4aa7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 17:04:06.863844 6261 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 17:04:06.865049 6261 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 17:04:06.865103 6261 factory.go:656] Stopping watch factory\\\\nI1007 17:04:06.865140 6261 ovnkube.go:599] Stopped ovnkube\\\\nI1007 17:04:06.865187 6261 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 17:04:06.865224 6261 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 17:04:06.865259 6261 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 17:04:06.865337 6261 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:04:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d6lkl_openshift-ovn-kubernetes(615b8d72-0ec5-42d0-966e-db1c2b787962)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:17Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.096175 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.096221 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.096271 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.096292 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.096307 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:17Z","lastTransitionTime":"2025-10-07T17:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.098844 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f5d9542-6447-4a77-829b-064c809cf81d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4eadece9eaef40838cea0c158dfd6208bf8392a02daab5c6e440143e2c9f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://942434d645ee2a2ed25d4535eec28588e1988b53927e54c50c1c15d293fe6a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99b069a866faa32130576f133ab0a61334f2e7f164cb87f204f032cc3c05391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f98542ab0c413832a25dc4529fa4fd723dee659e13fc3bff39b944185d969bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f98542ab0c413832a25dc4529fa4fd723dee659e13fc3bff39b944185d969bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:17Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.111754 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7
d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8e19e2693808909e599802074123b31395c3cb8992f734a1d7191848532e953\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:17Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.124011 4681 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:17Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.133615 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:17Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.145496 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a2c488b-e563-4bc2-aaec-064d33709f54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93918b739fba58a9b4e23d3131ae90e34548a07be5a59ceb18accaa6516572e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f7b57b64232d8263f8b4c64781ddba6d33986e7d379a64caa7b5abc9dae9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-52862\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:17Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.158399 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:17Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.172476 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:17Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.186992 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:17Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.198792 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:17Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.199228 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.199274 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.199286 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.199305 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.199320 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:17Z","lastTransitionTime":"2025-10-07T17:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.209086 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xjf9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35b1b84e-518a-4567-8ad9-0e717e9958fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xjf9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:17Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.219683 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:17Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.231252 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:17Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.243002 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f774e29adda222a4662ace0017b8b7697481452841491133bf4839360ba9a8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-07T17:04:17Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.251681 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:17Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.259866 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:17Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.302370 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.302646 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.302671 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.302750 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.302819 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:17Z","lastTransitionTime":"2025-10-07T17:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.405180 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.405225 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.405237 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.405254 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.405266 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:17Z","lastTransitionTime":"2025-10-07T17:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.508487 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.508537 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.508552 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.508570 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.508585 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:17Z","lastTransitionTime":"2025-10-07T17:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.610758 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.610804 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.610813 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.610825 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.610834 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:17Z","lastTransitionTime":"2025-10-07T17:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.713840 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.714126 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.714137 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.714151 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.714160 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:17Z","lastTransitionTime":"2025-10-07T17:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.817368 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.817410 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.817424 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.817443 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.817459 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:17Z","lastTransitionTime":"2025-10-07T17:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.920802 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.920834 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.920843 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.920858 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:17 crc kubenswrapper[4681]: I1007 17:04:17.920867 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:17Z","lastTransitionTime":"2025-10-07T17:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.023900 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.023943 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.023953 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.023972 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.023981 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:18Z","lastTransitionTime":"2025-10-07T17:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.125968 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.126026 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.126046 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.126069 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.126085 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:18Z","lastTransitionTime":"2025-10-07T17:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.228594 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.228632 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.228683 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.228704 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.228716 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:18Z","lastTransitionTime":"2025-10-07T17:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.331558 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.331621 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.331641 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.331665 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.331683 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:18Z","lastTransitionTime":"2025-10-07T17:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.434032 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.434070 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.434080 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.434098 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.434110 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:18Z","lastTransitionTime":"2025-10-07T17:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.536642 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.536681 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.536696 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.536713 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.536724 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:18Z","lastTransitionTime":"2025-10-07T17:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.639350 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.639456 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.639502 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.639519 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.639530 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:18Z","lastTransitionTime":"2025-10-07T17:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.742566 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.742612 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.742627 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.742647 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.742661 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:18Z","lastTransitionTime":"2025-10-07T17:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.845110 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.845148 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.845160 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.845176 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.845185 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:18Z","lastTransitionTime":"2025-10-07T17:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.947124 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.947408 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.947552 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.947652 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:18 crc kubenswrapper[4681]: I1007 17:04:18.947740 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:18Z","lastTransitionTime":"2025-10-07T17:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.028974 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.029103 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:19 crc kubenswrapper[4681]: E1007 17:04:19.029226 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.029260 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.029327 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:19 crc kubenswrapper[4681]: E1007 17:04:19.029452 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:04:19 crc kubenswrapper[4681]: E1007 17:04:19.029489 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:04:19 crc kubenswrapper[4681]: E1007 17:04:19.029607 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.050072 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.050114 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.050129 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.050147 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.050161 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:19Z","lastTransitionTime":"2025-10-07T17:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
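Has your network provider started?"}

The util.go:30 and pod_workers.go:1301 entries show the knock-on effect: pods that need a new sandbox (network-metrics-daemon-xjf9z, network-check-source, network-check-target, networking-console-plugin) cannot be synced until a CNI config file exists. A hypothetical check approximating the condition the kubelet message describes (only the directory name comes from the log; the file patterns are an assumption):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Directory named in the kubelet message; the network plugin is reported
	// ready only once a CNI configuration file shows up here.
	dir := "/etc/kubernetes/cni/net.d"

	var found []string
	for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
		matches, err := filepath.Glob(filepath.Join(dir, pat))
		if err != nil {
			continue // only fires on a malformed pattern
		}
		found = append(found, matches...)
	}

	if len(found) == 0 {
		fmt.Fprintf(os.Stderr, "no CNI configuration file in %s\n", dir)
		os.Exit(1)
	}
	for _, f := range found {
		fmt.Println(f)
	}
}
```
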
Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.151948 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.151991 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.152008 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.152029 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.152045 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:19Z","lastTransitionTime":"2025-10-07T17:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.254546 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.254580 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.254589 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.254603 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.254610 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:19Z","lastTransitionTime":"2025-10-07T17:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.356269 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.356313 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.356324 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.356339 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.356349 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:19Z","lastTransitionTime":"2025-10-07T17:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.458067 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.458280 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.458345 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.458425 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.458510 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:19Z","lastTransitionTime":"2025-10-07T17:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.561421 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.561482 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.561493 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.561511 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.561524 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:19Z","lastTransitionTime":"2025-10-07T17:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.663708 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.663977 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.664051 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.664131 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.664198 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:19Z","lastTransitionTime":"2025-10-07T17:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.766671 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.767056 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.767159 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.767252 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.767357 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:19Z","lastTransitionTime":"2025-10-07T17:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.869300 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.869588 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.869692 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.869793 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.869900 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:19Z","lastTransitionTime":"2025-10-07T17:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.972956 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.973045 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.973062 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.973083 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:19 crc kubenswrapper[4681]: I1007 17:04:19.973097 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:19Z","lastTransitionTime":"2025-10-07T17:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.029324 4681 scope.go:117] "RemoveContainer" containerID="500a6acc2b5dade410b551572ee9d73898ac20197b5aaf970f732bb2af507087" Oct 07 17:04:20 crc kubenswrapper[4681]: E1007 17:04:20.029598 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-d6lkl_openshift-ovn-kubernetes(615b8d72-0ec5-42d0-966e-db1c2b787962)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.074863 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.074924 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.074933 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.074954 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.074963 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:20Z","lastTransitionTime":"2025-10-07T17:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.177705 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.177747 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.177758 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.177775 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.177787 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:20Z","lastTransitionTime":"2025-10-07T17:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
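Has your network provider started?"}

The scope.go:117 RemoveContainer entry followed by the CrashLoopBackOff error shows why the NodeNotReady cycle persists: ovnkube-controller, part of the OVN-Kubernetes stack that would supply the missing CNI configuration, is itself restarting under backoff ("back-off 20s"). A sketch of that doubling backoff (the 10s base and 5m cap are assumed upstream kubelet defaults, not values shown in this log; the 20s above would be the second step):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Doubling restart delay in the style of kubelet's CrashLoopBackOff.
	backoff := 10 * time.Second
	const maxDelay = 5 * time.Minute

	for attempt := 1; attempt <= 6; attempt++ {
		fmt.Printf("restart %d: wait %s\n", attempt, backoff)
		backoff *= 2
		if backoff > maxDelay {
			backoff = maxDelay // delay is capped, not unbounded
		}
	}
}
```
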
Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.280301 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.280597 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.280776 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.280971 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.281128 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:20Z","lastTransitionTime":"2025-10-07T17:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.383491 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.383519 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.383526 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.383538 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.383546 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:20Z","lastTransitionTime":"2025-10-07T17:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.486429 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.486459 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.486467 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.486479 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.486487 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:20Z","lastTransitionTime":"2025-10-07T17:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.589066 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.589102 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.589114 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.589129 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.589139 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:20Z","lastTransitionTime":"2025-10-07T17:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.691563 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.691600 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.691609 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.691623 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.691632 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:20Z","lastTransitionTime":"2025-10-07T17:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.794638 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.794696 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.794706 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.794719 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.794731 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:20Z","lastTransitionTime":"2025-10-07T17:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.896443 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.896470 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.896478 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.896490 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.896499 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:20Z","lastTransitionTime":"2025-10-07T17:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.998469 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.998735 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.998804 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.998863 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:20 crc kubenswrapper[4681]: I1007 17:04:20.998940 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:20Z","lastTransitionTime":"2025-10-07T17:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:21 crc kubenswrapper[4681]: I1007 17:04:21.028906 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:21 crc kubenswrapper[4681]: I1007 17:04:21.028960 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:21 crc kubenswrapper[4681]: E1007 17:04:21.029240 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:04:21 crc kubenswrapper[4681]: I1007 17:04:21.029021 4681 util.go:30] "No sandbox for pod can be found. 
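The cycle above is the kubelet stamping the node's Ready condition to False (reason KubeletNotReady) roughly ten times per second. To confirm what the API server actually holds for the node, a minimal client-go sketch like the following could be used; the kubeconfig path is illustrative and error handling is trimmed to panics:

    package main

    import (
    	"context"
    	"fmt"

    	corev1 "k8s.io/api/core/v1"
    	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    	"k8s.io/client-go/kubernetes"
    	"k8s.io/client-go/tools/clientcmd"
    )

    func main() {
    	// Load a kubeconfig; the path here is an assumption, not from the log.
    	cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/kubelet/kubeconfig")
    	if err != nil {
    		panic(err)
    	}
    	cs, err := kubernetes.NewForConfig(cfg)
    	if err != nil {
    		panic(err)
    	}
    	// Fetch the node object the kubelet is updating ("crc" in this log).
    	node, err := cs.CoreV1().Nodes().Get(context.TODO(), "crc", metav1.GetOptions{})
    	if err != nil {
    		panic(err)
    	}
    	// The Ready condition carries the same reason/message recorded by setters.go:603.
    	for _, c := range node.Status.Conditions {
    		if c.Type == corev1.NodeReady {
    			fmt.Printf("Ready=%s reason=%s message=%q\n", c.Status, c.Reason, c.Message)
    		}
    	}
    }

Until a CNI configuration appears, this would print Ready=False with the same no-CNI message seen in the condition payloads above.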
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:21 crc kubenswrapper[4681]: I1007 17:04:21.029044 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:21 crc kubenswrapper[4681]: E1007 17:04:21.029250 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:04:21 crc kubenswrapper[4681]: E1007 17:04:21.029417 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:04:21 crc kubenswrapper[4681]: E1007 17:04:21.029466 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:04:21 crc kubenswrapper[4681]: I1007 17:04:21.100543 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:21 crc kubenswrapper[4681]: I1007 17:04:21.100839 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:21 crc kubenswrapper[4681]: I1007 17:04:21.100921 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:21 crc kubenswrapper[4681]: I1007 17:04:21.100991 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:21 crc kubenswrapper[4681]: I1007 17:04:21.101056 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:21Z","lastTransitionTime":"2025-10-07T17:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
[... the status cycle continues at ~100 ms intervals from 17:04:21.100 through 17:04:22.947, the condition timestamps advancing from 2025-10-07T17:04:21Z to 2025-10-07T17:04:22Z ...]
Oct 07 17:04:23 crc kubenswrapper[4681]: I1007 17:04:23.042029 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 17:04:23 crc kubenswrapper[4681]: I1007 17:04:23.042089 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z"
Oct 07 17:04:23 crc kubenswrapper[4681]: E1007 17:04:23.042142 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 17:04:23 crc kubenswrapper[4681]: I1007 17:04:23.042105 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 17:04:23 crc kubenswrapper[4681]: E1007 17:04:23.042234 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb"
Oct 07 17:04:23 crc kubenswrapper[4681]: I1007 17:04:23.042280 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 17:04:23 crc kubenswrapper[4681]: E1007 17:04:23.042374 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 17:04:23 crc kubenswrapper[4681]: E1007 17:04:23.042448 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
[... the status cycle repeats at 17:04:23.048 and 17:04:23.151 ...]
Oct 07 17:04:23 crc kubenswrapper[4681]: I1007 17:04:23.254613 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:23 crc kubenswrapper[4681]: I1007 17:04:23.254674 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:04:23 crc kubenswrapper[4681]: I1007 17:04:23.254687 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:23 crc kubenswrapper[4681]: I1007 17:04:23.254705 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:23 crc kubenswrapper[4681]: I1007 17:04:23.254716 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:23Z","lastTransitionTime":"2025-10-07T17:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:04:23 crc kubenswrapper[4681]: I1007 17:04:23.286478 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/35b1b84e-518a-4567-8ad9-0e717e9958fb-metrics-certs\") pod \"network-metrics-daemon-xjf9z\" (UID: \"35b1b84e-518a-4567-8ad9-0e717e9958fb\") " pod="openshift-multus/network-metrics-daemon-xjf9z"
Oct 07 17:04:23 crc kubenswrapper[4681]: E1007 17:04:23.286673 4681 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 07 17:04:23 crc kubenswrapper[4681]: E1007 17:04:23.286758 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35b1b84e-518a-4567-8ad9-0e717e9958fb-metrics-certs podName:35b1b84e-518a-4567-8ad9-0e717e9958fb nodeName:}" failed. No retries permitted until 2025-10-07 17:04:55.286739208 +0000 UTC m=+98.934150763 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/35b1b84e-518a-4567-8ad9-0e717e9958fb-metrics-certs") pod "network-metrics-daemon-xjf9z" (UID: "35b1b84e-518a-4567-8ad9-0e717e9958fb") : object "openshift-multus"/"metrics-daemon-secret" not registered
[... the status cycle repeats at 17:04:23.356 and 17:04:23.458 ...]
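This mount failure is distinct from the CNI problem: "not registered" likely means the kubelet's object cache has not yet synced the secret, and nestedpendingoperations backs the retry off exponentially, here to 32s (next attempt at 17:04:55, m=+98.93 being seconds since kubelet start). A sketch to check server-side whether the secret exists at all, under the same illustrative kubeconfig assumption as above:

    package main

    import (
    	"context"
    	"fmt"

    	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    	"k8s.io/client-go/kubernetes"
    	"k8s.io/client-go/tools/clientcmd"
    )

    func main() {
    	cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/kubelet/kubeconfig") // assumed path
    	if err != nil {
    		panic(err)
    	}
    	cs, err := kubernetes.NewForConfig(cfg)
    	if err != nil {
    		panic(err)
    	}
    	// The volume plugin wants openshift-multus/metrics-daemon-secret.
    	s, err := cs.CoreV1().Secrets("openshift-multus").Get(context.TODO(), "metrics-daemon-secret", metav1.GetOptions{})
    	if err != nil {
    		fmt.Println("secret lookup failed:", err)
    		return
    	}
    	fmt.Printf("secret exists with %d keys; the kubelet will retry the mount after the 32s backoff\n", len(s.Data))
    }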
[... the status cycle repeats at ~100 ms intervals at 17:04:23.560, .663, .766, .868, and .971 ...]
Oct 07 17:04:23 crc kubenswrapper[4681]: I1007 17:04:23.983964 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:23 crc kubenswrapper[4681]: I1007 17:04:23.984015 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:04:23 crc kubenswrapper[4681]: I1007 17:04:23.984032 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:23 crc kubenswrapper[4681]: I1007 17:04:23.984052 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:23 crc kubenswrapper[4681]: I1007 17:04:23.984068 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:23Z","lastTransitionTime":"2025-10-07T17:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:23 crc kubenswrapper[4681]: E1007 17:04:23.997469 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:23Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.001036 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.001071 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.001080 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.001095 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.001104 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:24Z","lastTransitionTime":"2025-10-07T17:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:24 crc kubenswrapper[4681]: E1007 17:04:24.012290 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:24Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.015490 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.015546 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.015557 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.015572 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.015585 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:24Z","lastTransitionTime":"2025-10-07T17:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:24 crc kubenswrapper[4681]: E1007 17:04:24.027751 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:24Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.031300 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.031335 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.031347 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.031363 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.031374 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:24Z","lastTransitionTime":"2025-10-07T17:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:24 crc kubenswrapper[4681]: E1007 17:04:24.047963 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:24Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.050924 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.050966 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.050977 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.050995 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.051007 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:24Z","lastTransitionTime":"2025-10-07T17:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:24 crc kubenswrapper[4681]: E1007 17:04:24.065710 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:24Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:24 crc kubenswrapper[4681]: E1007 17:04:24.065865 4681 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.073048 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.073082 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.073090 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.073102 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.073110 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:24Z","lastTransitionTime":"2025-10-07T17:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.175277 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.175313 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.175323 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.175338 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.175350 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:24Z","lastTransitionTime":"2025-10-07T17:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.277811 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.277853 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.277907 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.277925 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.277937 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:24Z","lastTransitionTime":"2025-10-07T17:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.380365 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.380401 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.380409 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.380422 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.380431 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:24Z","lastTransitionTime":"2025-10-07T17:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.482197 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.482233 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.482244 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.482260 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.482270 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:24Z","lastTransitionTime":"2025-10-07T17:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.584551 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.584587 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.584595 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.584608 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.584617 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:24Z","lastTransitionTime":"2025-10-07T17:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.686954 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.686988 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.686999 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.687014 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.687025 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:24Z","lastTransitionTime":"2025-10-07T17:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.789251 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.789283 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.789292 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.789306 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.789314 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:24Z","lastTransitionTime":"2025-10-07T17:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.892330 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.892375 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.892386 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.892402 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.892417 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:24Z","lastTransitionTime":"2025-10-07T17:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.994940 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.994979 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.994989 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.995013 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:24 crc kubenswrapper[4681]: I1007 17:04:24.995024 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:24Z","lastTransitionTime":"2025-10-07T17:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.028695 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.028807 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.028807 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:25 crc kubenswrapper[4681]: E1007 17:04:25.029121 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:04:25 crc kubenswrapper[4681]: E1007 17:04:25.028990 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:04:25 crc kubenswrapper[4681]: E1007 17:04:25.029156 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.028864 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:25 crc kubenswrapper[4681]: E1007 17:04:25.029209 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.097012 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.097054 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.097068 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.097088 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.097099 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:25Z","lastTransitionTime":"2025-10-07T17:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.222269 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.222314 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.222326 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.222343 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.222358 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:25Z","lastTransitionTime":"2025-10-07T17:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.324044 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.324085 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.324097 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.324114 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.324125 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:25Z","lastTransitionTime":"2025-10-07T17:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.422298 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bt6z6_78a1d2b3-3c0e-49f1-877c-db4f34d3154b/kube-multus/0.log" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.422343 4681 generic.go:334] "Generic (PLEG): container finished" podID="78a1d2b3-3c0e-49f1-877c-db4f34d3154b" containerID="f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01" exitCode=1 Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.422377 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bt6z6" event={"ID":"78a1d2b3-3c0e-49f1-877c-db4f34d3154b","Type":"ContainerDied","Data":"f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01"} Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.422821 4681 scope.go:117] "RemoveContainer" containerID="f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.426757 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.426792 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.426803 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.426820 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.426839 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:25Z","lastTransitionTime":"2025-10-07T17:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.435365 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:25Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.446755 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:04:25Z\\\",\\\"message\\\":\\\"2025-10-07T17:03:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b0bc378-da86-4f9a-a1e1-f918eee18f22\\\\n2025-10-07T17:03:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b0bc378-da86-4f9a-a1e1-f918eee18f22 to /host/opt/cni/bin/\\\\n2025-10-07T17:03:40Z [verbose] multus-daemon started\\\\n2025-10-07T17:03:40Z [verbose] Readiness Indicator file check\\\\n2025-10-07T17:04:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:25Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.462264 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f774e29adda222a4662ace0017b8b7697481452841491133bf4839360ba9a8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:25Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.471764 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:25Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.481565 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:25Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.502076 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c0790
6391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:25Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.519499 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:25Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.533152 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.533184 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.533191 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.533204 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.533213 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:25Z","lastTransitionTime":"2025-10-07T17:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.536432 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://500a6acc2b5dade410b551572ee9d73898ac20197b5aaf970f732bb2af507087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://500a6acc2b5dade410b551572ee9d73898ac20197b5aaf970f732bb2af507087\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:04:06Z\\\",\\\"message\\\":\\\"Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.16:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {3f1b9878-e751-4e46-a226-ce007d2c4aa7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 17:04:06.863844 6261 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 17:04:06.865049 6261 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 17:04:06.865103 6261 factory.go:656] Stopping watch factory\\\\nI1007 17:04:06.865140 6261 ovnkube.go:599] Stopped ovnkube\\\\nI1007 17:04:06.865187 6261 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 17:04:06.865224 6261 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 17:04:06.865259 6261 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 17:04:06.865337 6261 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:04:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed 
container=ovnkube-controller pod=ovnkube-node-d6lkl_openshift-ovn-kubernetes(615b8d72-0ec5-42d0-966e-db1c2b787962)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:25Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.547995 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f5d9542-6447-4a77-829b-064c809cf81d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4eadece9eaef40838cea0c158dfd6208bf8392a02daab5c6e440143e2c9f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://942434d645ee2a2ed25d4535eec28588e1988b53927e54c50c1c15d293fe6a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99b069a866faa32130576f133ab0a61334f2e7f164cb87f204f032cc3c05391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f98542ab0c413832a25dc4529fa4fd723dee659e13fc3bff39b944185d969bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f98542ab0c413832a25dc4529fa4fd723dee659e13fc3bff39b944185d969bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:25Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.561647 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8e19e2693808909e599802074123b31395c3cb8992f734a1d7191848532e953\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:25Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.574596 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:25Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.585312 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:25Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.596470 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a2c488b-e563-4bc2-aaec-064d33709f54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93918b739fba58a9b4e23d3131ae90e34548a07be5a59ceb18accaa6516572e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f7b57b64232d8263f8b4c64781ddba6d33986e7d379a64caa7b5abc9dae9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-52862\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:25Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.607681 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:25Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.618705 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:25Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.630458 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:25Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.635010 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.635038 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.635047 4681 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.635062 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.635072 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:25Z","lastTransitionTime":"2025-10-07T17:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.649080 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:25Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.659484 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xjf9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35b1b84e-518a-4567-8ad9-0e717e9958fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xjf9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:25Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.736984 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.737010 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.737018 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.737030 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.737038 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:25Z","lastTransitionTime":"2025-10-07T17:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.838740 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.838769 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.838776 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.838788 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.838797 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:25Z","lastTransitionTime":"2025-10-07T17:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.941056 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.941082 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.941089 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.941100 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:25 crc kubenswrapper[4681]: I1007 17:04:25.941108 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:25Z","lastTransitionTime":"2025-10-07T17:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.042818 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.042859 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.042872 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.042912 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.042930 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:26Z","lastTransitionTime":"2025-10-07T17:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.144866 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.144928 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.144938 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.144953 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.144962 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:26Z","lastTransitionTime":"2025-10-07T17:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.246549 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.246603 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.246618 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.246636 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.246649 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:26Z","lastTransitionTime":"2025-10-07T17:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.349660 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.349708 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.349719 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.349736 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.349746 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:26Z","lastTransitionTime":"2025-10-07T17:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.427332 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bt6z6_78a1d2b3-3c0e-49f1-877c-db4f34d3154b/kube-multus/0.log" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.427393 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bt6z6" event={"ID":"78a1d2b3-3c0e-49f1-877c-db4f34d3154b","Type":"ContainerStarted","Data":"bc452c09c8f7b7c7c78ba1ca48d06b861e7f647975cf88452a4426686d360817"} Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.446861 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"
imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:26Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.452769 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.452979 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.453069 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.453157 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.453240 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:26Z","lastTransitionTime":"2025-10-07T17:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.464956 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:26Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.475991 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:26Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.487176 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:26Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.496144 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xjf9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35b1b84e-518a-4567-8ad9-0e717e9958fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xjf9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:26Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.506085 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:26Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.517716 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc452c09c8f7b7c7c78ba1ca48d06b861e7f647975cf88452a4426686d360817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:04:25Z\\\",\\\"message\\\":\\\"2025-10-07T17:03:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b0bc378-da86-4f9a-a1e1-f918eee18f22\\\\n2025-10-07T17:03:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b0bc378-da86-4f9a-a1e1-f918eee18f22 to /host/opt/cni/bin/\\\\n2025-10-07T17:03:40Z [verbose] multus-daemon started\\\\n2025-10-07T17:03:40Z [verbose] Readiness Indicator file check\\\\n2025-10-07T17:04:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:26Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.533465 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f774e29adda222a4662ace0017b8b7697481452841491133bf4839360ba9a8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:26Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.545232 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:26Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.556116 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.556164 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.556185 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.556208 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.556224 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:26Z","lastTransitionTime":"2025-10-07T17:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.559149 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:26Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.575529 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c0790
6391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:26Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.586926 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:26Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.603308 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://500a6acc2b5dade410b551572ee9d73898ac2019
7b5aaf970f732bb2af507087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://500a6acc2b5dade410b551572ee9d73898ac20197b5aaf970f732bb2af507087\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:04:06Z\\\",\\\"message\\\":\\\"Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.16:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {3f1b9878-e751-4e46-a226-ce007d2c4aa7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 17:04:06.863844 6261 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 17:04:06.865049 6261 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 17:04:06.865103 6261 factory.go:656] Stopping watch factory\\\\nI1007 17:04:06.865140 6261 ovnkube.go:599] Stopped ovnkube\\\\nI1007 17:04:06.865187 6261 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 17:04:06.865224 6261 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 17:04:06.865259 6261 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 17:04:06.865337 6261 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:04:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d6lkl_openshift-ovn-kubernetes(615b8d72-0ec5-42d0-966e-db1c2b787962)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:26Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.612498 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f5d9542-6447-4a77-829b-064c809cf81d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4eadece9eaef40838cea0c158dfd6208bf8392a02daab5c6e440143e2c9f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://942434d645ee2a2ed25d4535eec28588e1988b53927e54c50c1c15d293fe6a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99b069a866faa32130576f133ab0a61334f2e7f164cb87f204f032cc3c05391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f98542ab0c413832a25dc4529fa4fd723dee659e13fc3bff39b944185d969bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f98542ab0c413832a25dc4529fa4fd723dee659e13fc3bff39b944185d969bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:26Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.624283 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8e19e2693808909e599802074123b31395c3cb8992f734a1d7191848532e953\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:26Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.636556 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:26Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.647404 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:26Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.655745 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a2c488b-e563-4bc2-aaec-064d33709f54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93918b739fba58a9b4e23d3131ae90e34548a07be5a59ceb18accaa6516572e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f7b57b64232d8263f8b4c64781ddba6d33986e7d379a64caa7b5abc9dae9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-52862\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:26Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.658204 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.658227 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.658237 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.658253 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.658264 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:26Z","lastTransitionTime":"2025-10-07T17:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.760425 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.760464 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.760477 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.760491 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.760502 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:26Z","lastTransitionTime":"2025-10-07T17:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.862817 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.862912 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.862926 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.862942 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.862952 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:26Z","lastTransitionTime":"2025-10-07T17:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.964861 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.964912 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.964925 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.964941 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:26 crc kubenswrapper[4681]: I1007 17:04:26.964951 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:26Z","lastTransitionTime":"2025-10-07T17:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.028150 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.028201 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.028178 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.028369 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:27 crc kubenswrapper[4681]: E1007 17:04:27.028381 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:04:27 crc kubenswrapper[4681]: E1007 17:04:27.028474 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:04:27 crc kubenswrapper[4681]: E1007 17:04:27.028552 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:04:27 crc kubenswrapper[4681]: E1007 17:04:27.028605 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.041024 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:27Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.052298 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:27Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.062850 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:27Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.067214 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.067263 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.067280 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.067304 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.067321 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:27Z","lastTransitionTime":"2025-10-07T17:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.071544 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xjf9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35b1b84e-518a-4567-8ad9-0e717e9958fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xjf9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:27Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.083174 4681 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f
23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:27Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.096708 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc452c09c8f7b7c7c78ba1ca48d06b861e7f647975cf88452a4426686d360817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:04:25Z\\\",\\\"message\\\":\\\"2025-10-07T17:03:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b0bc378-da86-4f9a-a1e1-f918eee18f22\\\\n2025-10-07T17:03:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b0bc378-da86-4f9a-a1e1-f918eee18f22 to 
/host/opt/cni/bin/\\\\n2025-10-07T17:03:40Z [verbose] multus-daemon started\\\\n2025-10-07T17:03:40Z [verbose] Readiness Indicator file check\\\\n2025-10-07T17:04:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:27Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.110273 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f774e29adda222a4662ace0017b8b7697481452841491133bf4839360ba9a8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:27Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.119587 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:27Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.127804 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:27Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.141174 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:27Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.151743 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:27Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.166253 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://500a6acc2b5dade410b551572ee9d73898ac2019
7b5aaf970f732bb2af507087\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://500a6acc2b5dade410b551572ee9d73898ac20197b5aaf970f732bb2af507087\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:04:06Z\\\",\\\"message\\\":\\\"Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.16:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {3f1b9878-e751-4e46-a226-ce007d2c4aa7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 17:04:06.863844 6261 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 17:04:06.865049 6261 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 17:04:06.865103 6261 factory.go:656] Stopping watch factory\\\\nI1007 17:04:06.865140 6261 ovnkube.go:599] Stopped ovnkube\\\\nI1007 17:04:06.865187 6261 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 17:04:06.865224 6261 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 17:04:06.865259 6261 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 17:04:06.865337 6261 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:04:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d6lkl_openshift-ovn-kubernetes(615b8d72-0ec5-42d0-966e-db1c2b787962)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:27Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.169321 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.169343 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.169352 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.169364 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.169374 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:27Z","lastTransitionTime":"2025-10-07T17:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.217453 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c07906391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:27Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.237129 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8e19e2693808909e599802074123b31395c3cb8992f734a1d7191848532e953\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:27Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.256508 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:27Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.266483 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:27Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.271575 4681 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.271698 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.271758 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.271825 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.271905 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:27Z","lastTransitionTime":"2025-10-07T17:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.275707 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a2c488b-e563-4bc2-aaec-064d33709f54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93918b739fba58a9b4e23d3131ae90e34548a07be5a59ceb18accaa6516572e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f7b57b64232d8263f8b4c64781ddba6d33986e7d379a64caa7b5abc9dae9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-52862\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:27Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.286948 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f5d9542-6447-4a77-829b-064c809cf81d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4eadece9eaef40838cea0c158dfd6208bf8392a02daab5c6e440143e2c9f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://942434d645ee2a2ed25d4535eec28588e1988b53927e54c50c1c15d293fe6a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380
066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99b069a866faa32130576f133ab0a61334f2e7f164cb87f204f032cc3c05391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f98542ab0c413832a25dc4529fa4fd723dee659e13fc3bff39b944185d969bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f98542ab0c413832a25dc4529fa4fd723dee659e13fc3bff39b944185d969bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:27Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.373059 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.373559 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.373642 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.373708 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.373774 4681 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:27Z","lastTransitionTime":"2025-10-07T17:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.475644 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.475684 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.475693 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.475707 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.475719 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:27Z","lastTransitionTime":"2025-10-07T17:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.578213 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.578404 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.578495 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.578568 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.578624 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:27Z","lastTransitionTime":"2025-10-07T17:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.681560 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.681621 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.681634 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.681654 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.681670 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:27Z","lastTransitionTime":"2025-10-07T17:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.783788 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.783827 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.783838 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.783865 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.783874 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:27Z","lastTransitionTime":"2025-10-07T17:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.886585 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.886612 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.886620 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.886634 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.886642 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:27Z","lastTransitionTime":"2025-10-07T17:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.989063 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.989111 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.989119 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.989132 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:27 crc kubenswrapper[4681]: I1007 17:04:27.989141 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:27Z","lastTransitionTime":"2025-10-07T17:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.091119 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.091150 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.091160 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.091177 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.091187 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:28Z","lastTransitionTime":"2025-10-07T17:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.193871 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.193917 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.193925 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.193938 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.193946 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:28Z","lastTransitionTime":"2025-10-07T17:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.297063 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.297114 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.297124 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.297142 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.297155 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:28Z","lastTransitionTime":"2025-10-07T17:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.399595 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.399643 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.399657 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.399674 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.399683 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:28Z","lastTransitionTime":"2025-10-07T17:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.501654 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.501692 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.501701 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.501716 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.501724 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:28Z","lastTransitionTime":"2025-10-07T17:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.603799 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.604030 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.604044 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.604059 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.604078 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:28Z","lastTransitionTime":"2025-10-07T17:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.706691 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.706730 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.706742 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.706761 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.706773 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:28Z","lastTransitionTime":"2025-10-07T17:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.808967 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.808992 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.809000 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.809012 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.809020 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:28Z","lastTransitionTime":"2025-10-07T17:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.910380 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.910416 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.910425 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.910441 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:28 crc kubenswrapper[4681]: I1007 17:04:28.910450 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:28Z","lastTransitionTime":"2025-10-07T17:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.013269 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.013325 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.013348 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.013375 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.013395 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:29Z","lastTransitionTime":"2025-10-07T17:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.029074 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.029133 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:29 crc kubenswrapper[4681]: E1007 17:04:29.029225 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.029084 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:29 crc kubenswrapper[4681]: E1007 17:04:29.029317 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:04:29 crc kubenswrapper[4681]: E1007 17:04:29.029384 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.029509 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:29 crc kubenswrapper[4681]: E1007 17:04:29.029564 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.115920 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.115957 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.115964 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.115979 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.115987 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:29Z","lastTransitionTime":"2025-10-07T17:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.218529 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.218565 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.218573 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.218586 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.218595 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:29Z","lastTransitionTime":"2025-10-07T17:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.320555 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.320614 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.320626 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.320642 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.320653 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:29Z","lastTransitionTime":"2025-10-07T17:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.422310 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.422345 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.422356 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.422371 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.422382 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:29Z","lastTransitionTime":"2025-10-07T17:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.524310 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.524348 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.524358 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.524373 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.524384 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:29Z","lastTransitionTime":"2025-10-07T17:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.626586 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.626625 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.626633 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.626646 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.626656 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:29Z","lastTransitionTime":"2025-10-07T17:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.728817 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.728856 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.728864 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.728896 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.728906 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:29Z","lastTransitionTime":"2025-10-07T17:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.831511 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.831544 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.831552 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.831565 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.831574 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:29Z","lastTransitionTime":"2025-10-07T17:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.933476 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.933518 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.933529 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.933545 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:29 crc kubenswrapper[4681]: I1007 17:04:29.933559 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:29Z","lastTransitionTime":"2025-10-07T17:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:30 crc kubenswrapper[4681]: I1007 17:04:30.035500 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:30 crc kubenswrapper[4681]: I1007 17:04:30.035536 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:30 crc kubenswrapper[4681]: I1007 17:04:30.035545 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:30 crc kubenswrapper[4681]: I1007 17:04:30.035561 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:30 crc kubenswrapper[4681]: I1007 17:04:30.035569 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:30Z","lastTransitionTime":"2025-10-07T17:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 07 17:04:30 crc kubenswrapper[4681]: I1007 17:04:30.138093 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:30 crc kubenswrapper[4681]: I1007 17:04:30.138303 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:04:30 crc kubenswrapper[4681]: I1007 17:04:30.138391 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:30 crc kubenswrapper[4681]: I1007 17:04:30.138466 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:30 crc kubenswrapper[4681]: I1007 17:04:30.138530 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:30Z","lastTransitionTime":"2025-10-07T17:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
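The "Node became not ready" entry above carries the full node condition as inline JSON, so the transition reason and message can be recovered mechanically from the journal. A minimal sketch (not part of the capture; the shortened sample line and helper name are ours):

    import json, re

    # Sample taken from the setters.go entry above (message shortened here).
    LINE = 'I1007 17:04:30.138530 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:30Z","lastTransitionTime":"2025-10-07T17:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready"}'

    def ready_condition(line):
        # The condition={...} payload is well-formed JSON; match it greedily
        # so braces inside the message (none in this sample) stay intact.
        m = re.search(r'condition=(\{.*\})', line)
        return json.loads(m.group(1)) if m else None

    cond = ready_condition(LINE)
    if cond:
        print(cond["reason"], "->", cond["message"])  # KubeletNotReady -> ...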
Oct 07 17:04:31 crc kubenswrapper[4681]: I1007 17:04:31.028163 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 17:04:31 crc kubenswrapper[4681]: I1007 17:04:31.028225 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z"
Oct 07 17:04:31 crc kubenswrapper[4681]: I1007 17:04:31.028223 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 17:04:31 crc kubenswrapper[4681]: E1007 17:04:31.028315 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 17:04:31 crc kubenswrapper[4681]: I1007 17:04:31.028364 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 17:04:31 crc kubenswrapper[4681]: E1007 17:04:31.028412 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb"
Oct 07 17:04:31 crc kubenswrapper[4681]: E1007 17:04:31.028492 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 17:04:31 crc kubenswrapper[4681]: E1007 17:04:31.028565 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 17:04:31 crc kubenswrapper[4681]: I1007 17:04:31.066084 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:31 crc kubenswrapper[4681]: I1007 17:04:31.066116 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:04:31 crc kubenswrapper[4681]: I1007 17:04:31.066127 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:31 crc kubenswrapper[4681]: I1007 17:04:31.066144 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:31 crc kubenswrapper[4681]: I1007 17:04:31.066155 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:31Z","lastTransitionTime":"2025-10-07T17:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:04:33 crc kubenswrapper[4681]: I1007 17:04:33.029042 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z"
Oct 07 17:04:33 crc kubenswrapper[4681]: I1007 17:04:33.029073 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 17:04:33 crc kubenswrapper[4681]: E1007 17:04:33.029160 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb"
Oct 07 17:04:33 crc kubenswrapper[4681]: I1007 17:04:33.029183 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 17:04:33 crc kubenswrapper[4681]: E1007 17:04:33.029257 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 17:04:33 crc kubenswrapper[4681]: I1007 17:04:33.029293 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 17:04:33 crc kubenswrapper[4681]: E1007 17:04:33.029329 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 17:04:33 crc kubenswrapper[4681]: E1007 17:04:33.029386 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 17:04:33 crc kubenswrapper[4681]: I1007 17:04:33.128634 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:33 crc kubenswrapper[4681]: I1007 17:04:33.128679 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:04:33 crc kubenswrapper[4681]: I1007 17:04:33.128691 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:33 crc kubenswrapper[4681]: I1007 17:04:33.128706 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:33 crc kubenswrapper[4681]: I1007 17:04:33.128715 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:33Z","lastTransitionTime":"2025-10-07T17:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
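The root cause surfaces in the entries recorded below: node-status patches are rejected because the node.network-node-identity.openshift.io webhook's serving certificate expired on 2025-08-24, well before the current time in the log. A hedged sketch for probing a local TLS endpoint's certificate validity window (endpoint taken from the error below; the third-party cryptography package is an assumption):

    import datetime, socket, ssl
    from cryptography import x509  # third-party; assumed installed

    HOST, PORT = "127.0.0.1", 9743  # webhook endpoint named in the error below

    ctx = ssl.create_default_context()
    ctx.check_hostname = False       # inspecting certificate dates only,
    ctx.verify_mode = ssl.CERT_NONE  # not making a trust decision

    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            der = tls.getpeercert(binary_form=True)  # raw DER certificate

    cert = x509.load_der_x509_certificate(der)
    now = datetime.datetime.now(datetime.timezone.utc)
    print("notAfter:", cert.not_valid_after_utc)        # cryptography >= 42
    print("expired:", now > cert.not_valid_after_utc)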
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.029266 4681 scope.go:117] "RemoveContainer" containerID="500a6acc2b5dade410b551572ee9d73898ac20197b5aaf970f732bb2af507087"
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.215800 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.215865 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.215920 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.215951 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.215970 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:34Z","lastTransitionTime":"2025-10-07T17:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:34 crc kubenswrapper[4681]: E1007 17:04:34.234407 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.239386 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.239429 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.239442 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.239459 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.239470 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:34Z","lastTransitionTime":"2025-10-07T17:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:34 crc kubenswrapper[4681]: E1007 17:04:34.256160 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.260227 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.260277 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure"
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.260289 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.260307 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.260319 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:34Z","lastTransitionTime":"2025-10-07T17:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:04:34 crc kubenswrapper[4681]: E1007 17:04:34.276279 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [identical node-status payload elided; full payload shown in the 17:04:34.256160 attempt above] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z"
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.279919 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.279961 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
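The retry loop above keeps failing for one reason: the network-node-identity admission webhook registered for Node patches is serving a certificate whose NotAfter (2025-08-24T17:21:41Z) is in the past relative to the node clock (2025-10-07), so every status PATCH is rejected before it reaches the API server's storage. A minimal Go sketch that inspects the certificate actually presented by the listener on 127.0.0.1:9743 (the address comes from the log line; InsecureSkipVerify is used only so the handshake completes and the expired certificate can be read, not to trust it):

package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// Dial the webhook endpoint named in the log line; skip verification so
	// the handshake succeeds even though the served certificate is expired.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
	for _, cert := range conn.ConnectionState().PeerCertificates {
		fmt.Printf("subject=%q notAfter=%s expired=%v\n",
			cert.Subject.CommonName,
			cert.NotAfter.Format(time.RFC3339),
			time.Now().After(cert.NotAfter))
	}
}

On a CRC cluster that has been powered off past its certificate lifetime this is typically the expected state until the cluster-managed rotation catches up; the kubelet itself is healthy, which is why it keeps retrying.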
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.279976 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.279998 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.280011 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:34Z","lastTransitionTime":"2025-10-07T17:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:04:34 crc kubenswrapper[4681]: E1007 17:04:34.295698 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [identical node-status payload elided] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z"
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.299704 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.299746 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
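The NotReady condition being patched here is independent of the webhook failure: the kubelet reports NetworkReady=false because no CNI configuration file exists yet in /etc/kubernetes/cni/net.d/ (on this node that file is written by ovnkube-node once it comes up). A sketch of the check the CNI plugin manager effectively performs, assuming the confDir from the message and the file extensions the standard libcni loader accepts:

package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"
)

func main() {
	confDir := "/etc/kubernetes/cni/net.d" // directory named in the kubelet message
	entries, err := os.ReadDir(confDir)
	if err != nil {
		log.Fatal(err)
	}
	found := false
	for _, e := range entries {
		// libcni treats .conf, .conflist and .json files as network configs.
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("CNI config present:", filepath.Join(confDir, e.Name()))
			found = true
		}
	}
	if !found {
		fmt.Println("no CNI configuration file in", confDir, "- Ready stays False")
	}
}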
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.299760 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.299779 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.299791 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:34Z","lastTransitionTime":"2025-10-07T17:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:04:34 crc kubenswrapper[4681]: E1007 17:04:34.314337 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [identical node-status payload elided] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z"
Oct 07 17:04:34 crc kubenswrapper[4681]: E1007 17:04:34.314460 4681 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
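The "exceeds retry count" line marks the end of one update cycle, not a terminal failure: the kubelet attempts the status patch a fixed number of times per sync (the nodeStatusUpdateRetry constant in the kubelet source, 5 in current releases, which matches the five E-level attempts above) and then waits for the next nodeStatusUpdateFrequency tick before starting over. A simplified sketch of that loop shape (not the kubelet source itself; the constant name is real, the rest is illustrative):

package main

import (
	"errors"
	"fmt"
)

// nodeStatusUpdateRetry mirrors the kubelet constant of the same name:
// the number of patch attempts per sync before giving up until the next tick.
const nodeStatusUpdateRetry = 5

func updateNodeStatus(patch func() error) error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := patch(); err != nil {
			fmt.Println("Error updating node status, will retry:", err)
			continue
		}
		return nil
	}
	return errors.New("update node status exceeds retry count")
}

func main() {
	// Every attempt hits the same expired-certificate webhook error, so the
	// loop exhausts its five tries and logs the line seen above.
	webhookDown := errors.New(`failed calling webhook "node.network-node-identity.openshift.io": certificate has expired`)
	if err := updateNodeStatus(func() error { return webhookDown }); err != nil {
		fmt.Println(err)
	}
}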
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.315792 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.315817 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.315826 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.315838 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.315846 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:34Z","lastTransitionTime":"2025-10-07T17:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.418445 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.418521 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.418534 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.418550 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.418598 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:34Z","lastTransitionTime":"2025-10-07T17:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.453725 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6lkl_615b8d72-0ec5-42d0-966e-db1c2b787962/ovnkube-controller/2.log" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.456422 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerStarted","Data":"c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5"} Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.456872 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.481116 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca
1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8e19e2693808909e599802074123b31395c3cb8992f734a1d7191848532e953\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.505368 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.519565 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.520407 4681 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.520448 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.520458 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.520473 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.520482 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:34Z","lastTransitionTime":"2025-10-07T17:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.531609 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a2c488b-e563-4bc2-aaec-064d33709f54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93918b739fba58a9b4e23d3131ae90e34548a07be5a59ceb18accaa6516572e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f7b57b64232d8263f8b4c64781ddba6d33986e7d379a64caa7b5abc9dae9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-52862\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.546614 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f5d9542-6447-4a77-829b-064c809cf81d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4eadece9eaef40838cea0c158dfd6208bf8392a02daab5c6e440143e2c9f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://942434d645ee2a2ed25d4535eec28588e1988b53927e54c50c1c15d293fe6a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380
066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99b069a866faa32130576f133ab0a61334f2e7f164cb87f204f032cc3c05391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f98542ab0c413832a25dc4529fa4fd723dee659e13fc3bff39b944185d969bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f98542ab0c413832a25dc4529fa4fd723dee659e13fc3bff39b944185d969bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.557789 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.569429 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.579901 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xjf9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35b1b84e-518a-4567-8ad9-0e717e9958fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xjf9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.590725 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.603459 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.617253 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f774e29adda222a4662ace0017b8b7697481452841491133bf4839360ba9a8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:34 crc 
kubenswrapper[4681]: I1007 17:04:34.622800 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.622835 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.622845 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.622862 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.622896 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:34Z","lastTransitionTime":"2025-10-07T17:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.629108 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.639395 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.656545 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.672968 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc452c09c8f7b7c7c78ba1ca48d06b861e7f647975cf88452a4426686d360817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:04:25Z\\\",\\\"message\\\":\\\"2025-10-07T17:03:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b0bc378-da86-4f9a-a1e1-f918eee18f22\\\\n2025-10-07T17:03:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b0bc378-da86-4f9a-a1e1-f918eee18f22 to /host/opt/cni/bin/\\\\n2025-10-07T17:03:40Z [verbose] multus-daemon started\\\\n2025-10-07T17:03:40Z [verbose] Readiness Indicator file check\\\\n2025-10-07T17:04:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.696525 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://500a6acc2b5dade410b551572ee9d73898ac20197b5aaf970f732bb2af507087\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:04:06Z\\\",\\\"message\\\":\\\"Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.16:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {3f1b9878-e751-4e46-a226-ce007d2c4aa7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 17:04:06.863844 6261 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 17:04:06.865049 6261 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 17:04:06.865103 6261 factory.go:656] Stopping watch factory\\\\nI1007 17:04:06.865140 6261 ovnkube.go:599] Stopped ovnkube\\\\nI1007 17:04:06.865187 6261 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 17:04:06.865224 6261 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 17:04:06.865259 6261 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 17:04:06.865337 6261 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:04:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.713771 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c0790
6391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.724351 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.724924 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.724954 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.724962 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.724976 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.724985 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:34Z","lastTransitionTime":"2025-10-07T17:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.827021 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.827059 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.827070 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.827086 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.827095 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:34Z","lastTransitionTime":"2025-10-07T17:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.929664 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.929712 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.929724 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.929741 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:34 crc kubenswrapper[4681]: I1007 17:04:34.929751 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:34Z","lastTransitionTime":"2025-10-07T17:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.028444 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.028461 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.028456 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.028484 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:35 crc kubenswrapper[4681]: E1007 17:04:35.028685 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:04:35 crc kubenswrapper[4681]: E1007 17:04:35.028784 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:04:35 crc kubenswrapper[4681]: E1007 17:04:35.028902 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:04:35 crc kubenswrapper[4681]: E1007 17:04:35.028956 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.033671 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.033700 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.033711 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.033725 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.033735 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:35Z","lastTransitionTime":"2025-10-07T17:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.135926 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.136004 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.136014 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.136031 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.136042 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:35Z","lastTransitionTime":"2025-10-07T17:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.238156 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.238187 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.238196 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.238210 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.238220 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:35Z","lastTransitionTime":"2025-10-07T17:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.340982 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.341021 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.341029 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.341045 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.341054 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:35Z","lastTransitionTime":"2025-10-07T17:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.443608 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.443637 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.443645 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.443657 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.443666 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:35Z","lastTransitionTime":"2025-10-07T17:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.460919 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6lkl_615b8d72-0ec5-42d0-966e-db1c2b787962/ovnkube-controller/3.log" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.461650 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6lkl_615b8d72-0ec5-42d0-966e-db1c2b787962/ovnkube-controller/2.log" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.463933 4681 generic.go:334] "Generic (PLEG): container finished" podID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerID="c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5" exitCode=1 Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.463969 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerDied","Data":"c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5"} Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.464001 4681 scope.go:117] "RemoveContainer" containerID="500a6acc2b5dade410b551572ee9d73898ac20197b5aaf970f732bb2af507087" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.464518 4681 scope.go:117] "RemoveContainer" containerID="c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5" Oct 07 17:04:35 crc kubenswrapper[4681]: E1007 17:04:35.464654 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-d6lkl_openshift-ovn-kubernetes(615b8d72-0ec5-42d0-966e-db1c2b787962)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.481196 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f774e29adda222a4662ace0017b8b7697481452841491133bf4839360ba9a8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:35Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.490471 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:35Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.499435 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:35Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.510935 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:35Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.522217 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc452c09c8f7b7c7c78ba1ca48d06b861e7f647975cf88452a4426686d360817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:04:25Z\\\",\\\"message\\\":\\\"2025-10-07T17:03:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b0bc378-da86-4f9a-a1e1-f918eee18f22\\\\n2025-10-07T17:03:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b0bc378-da86-4f9a-a1e1-f918eee18f22 to 
/host/opt/cni/bin/\\\\n2025-10-07T17:03:40Z [verbose] multus-daemon started\\\\n2025-10-07T17:03:40Z [verbose] Readiness Indicator file check\\\\n2025-10-07T17:04:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:35Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.539970 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://500a6acc2b5dade410b551572ee9d73898ac20197b5aaf970f732bb2af507087\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:04:06Z\\\",\\\"message\\\":\\\"Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-controller]} name:Service_openshift-machine-config-operator/machine-config-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.16:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {3f1b9878-e751-4e46-a226-ce007d2c4aa7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 17:04:06.863844 6261 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 17:04:06.865049 6261 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 17:04:06.865103 6261 factory.go:656] Stopping watch factory\\\\nI1007 17:04:06.865140 6261 ovnkube.go:599] Stopped ovnkube\\\\nI1007 17:04:06.865187 6261 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 17:04:06.865224 6261 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 17:04:06.865259 6261 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 17:04:06.865337 6261 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:04:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:04:34Z\\\",\\\"message\\\":\\\"rc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z]\\\\nI1007 17:04:34.901057 6612 lb_config.go:1031] Cluster endpoints for openshift-machine-api/machine-api-operator-webhook for network=default are: map[]\\\\nI1007 17:04:34.901078 6612 services_controller.go:443] Built service openshift-machine-api/machine-api-operator-webhook LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.254\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1007 17:04:34.901084 6612 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-daemon\\\\\\\"}\\\\nI1007 17:04:34.901064 6612 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/control-p\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da
810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:35Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.545765 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.545786 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.545794 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.545806 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:35 crc 
kubenswrapper[4681]: I1007 17:04:35.545814 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:35Z","lastTransitionTime":"2025-10-07T17:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.557352 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0
fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c07906391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:35Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.568661 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:35Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.582657 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"container
ID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8e19e2693808909e599802074123b31395c3cb8992f734a1d7191848532e953\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 
17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:35Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 
17:04:35.593781 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:35Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.602725 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:35Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.612526 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a2c488b-e563-4bc2-aaec-064d33709f54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93918b739fba58a9b4e23d3131ae90e34548a07be5a59ceb18accaa6516572e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f7b57b64232d8263f8b4c64781ddba6d33986e7d379a64caa7b5abc9dae9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-52862\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:35Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.623310 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f5d9542-6447-4a77-829b-064c809cf81d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4eadece9eaef40838cea0c158dfd6208bf8392a02daab5c6e440143e2c9f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://942434d645ee2a2ed25d4535eec28588e1988b53927e54c50c1c15d293fe6a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99b069a866faa32130576f133ab0a61334f2e7f164cb87f204f032cc3c05391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f98542ab0c413832a25dc4529fa4fd723dee659e13fc3bff39b944185d969bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f98542ab0c413832a25dc4529fa4fd723dee659e13fc3bff39b944185d969bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:35Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.634971 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:35Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.647173 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:35Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.648469 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.648506 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.648516 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.648531 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.648543 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:35Z","lastTransitionTime":"2025-10-07T17:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.662895 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xjf9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35b1b84e-518a-4567-8ad9-0e717e9958fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xjf9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:35Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.676991 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:35Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.691383 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:35Z is after 2025-08-24T17:21:41Z"
Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.751691 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.752086 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.752157 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.752227 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.752285 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:35Z","lastTransitionTime":"2025-10-07T17:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.854706 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.854731 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.854738 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.854750 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.854759 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:35Z","lastTransitionTime":"2025-10-07T17:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.957074 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.957114 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.957122 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.957141 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:35 crc kubenswrapper[4681]: I1007 17:04:35.957152 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:35Z","lastTransitionTime":"2025-10-07T17:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.037723 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.059049 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.059322 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.059430 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.059501 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.059565 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:36Z","lastTransitionTime":"2025-10-07T17:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.162094 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.162347 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.162471 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.162558 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.162645 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:36Z","lastTransitionTime":"2025-10-07T17:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.265449 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.265768 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.266037 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.266239 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.266384 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:36Z","lastTransitionTime":"2025-10-07T17:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.368911 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.368941 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.368951 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.368964 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.368973 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:36Z","lastTransitionTime":"2025-10-07T17:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.471690 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.471733 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.471742 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.471755 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.471763 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:36Z","lastTransitionTime":"2025-10-07T17:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.472736 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6lkl_615b8d72-0ec5-42d0-966e-db1c2b787962/ovnkube-controller/3.log"
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.476544 4681 scope.go:117] "RemoveContainer" containerID="c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5"
Oct 07 17:04:36 crc kubenswrapper[4681]: E1007 17:04:36.476690 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-d6lkl_openshift-ovn-kubernetes(615b8d72-0ec5-42d0-966e-db1c2b787962)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962"
Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.506787 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c0790
6391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:36Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.525515 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:36Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.548531 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0748b3f1e13507c2e70e5f09b7c00e078804394
78c2153c726b8a59312e25a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:04:34Z\\\",\\\"message\\\":\\\"rc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z]\\\\nI1007 17:04:34.901057 6612 lb_config.go:1031] Cluster endpoints for openshift-machine-api/machine-api-operator-webhook for network=default are: map[]\\\\nI1007 17:04:34.901078 6612 services_controller.go:443] Built service openshift-machine-api/machine-api-operator-webhook LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.254\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1007 17:04:34.901084 6612 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-daemon\\\\\\\"}\\\\nI1007 17:04:34.901064 6612 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/control-p\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:04:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d6lkl_openshift-ovn-kubernetes(615b8d72-0ec5-42d0-966e-db1c2b787962)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:36Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.562976 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:36Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.576206 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.576241 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.576250 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.576264 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.576274 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:36Z","lastTransitionTime":"2025-10-07T17:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.579045 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a2c488b-e563-4bc2-aaec-064d33709f54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93918b739fba58a9b4e23d3131ae90e34548a07be5a59ceb18accaa6516572e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f7b57b64232d8263f8b4c64781ddba6d33986e7d379a64caa7b5abc9dae9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-52862\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:36Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.593899 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f5d9542-6447-4a77-829b-064c809cf81d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4eadece9eaef40838cea0c158dfd6208bf8392a02daab5c6e440143e2c9f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://942434d645ee2a2ed25d4535eec28588e1988b53927e54c50c1c15d293fe6a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99b069a866faa32130576f133ab0a61334f2e7f164cb87f204f032cc3c05391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f98542ab0c413832a25dc4529fa4fd723dee659e13fc3bff39b944185d969bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f98542ab0c413832a25dc4529fa4fd723dee659e13fc3bff39b944185d969bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:36Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.605148 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19107d9c-8793-4766-9661-014743799a9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f498c4ac9c2e2aa10188ede03e77421d502ec718a71af923d84081943350a914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0751011a6e111dd3b3d09222e826afe9f712b02143a54130dfa00361cf6d3d98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0751011a6e111dd3b3d09222e826afe9f712b02143a54130dfa00361cf6d3d98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:36Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.621587 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8e19e2693808909e599802074123b31395c3cb8992f734a1d7191848532e953\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:36Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.633957 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:36Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.644836 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xjf9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35b1b84e-518a-4567-8ad9-0e717e9958fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xjf9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:36Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.657725 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:36Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.669491 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:36Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.680708 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.680748 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.680762 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.680781 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.680798 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:36Z","lastTransitionTime":"2025-10-07T17:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.685082 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa4
1ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:36Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.696793 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:36Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.709989 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:36Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.723557 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:36Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.736784 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc452c09c8f7b7c7c78ba1ca48d06b861e7f647975cf88452a4426686d360817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:04:25Z\\\",\\\"message\\\":\\\"2025-10-07T17:03:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b0bc378-da86-4f9a-a1e1-f918eee18f22\\\\n2025-10-07T17:03:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b0bc378-da86-4f9a-a1e1-f918eee18f22 to /host/opt/cni/bin/\\\\n2025-10-07T17:03:40Z [verbose] multus-daemon started\\\\n2025-10-07T17:03:40Z [verbose] Readiness Indicator file check\\\\n2025-10-07T17:04:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:36Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.754704 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f774e29adda222a4662ace0017b8b7697481452841491133bf4839360ba9a8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:36Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.764522 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:36Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.783111 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.783252 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.783348 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.783433 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.783618 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:36Z","lastTransitionTime":"2025-10-07T17:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.886067 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.886116 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.886124 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.886136 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.886145 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:36Z","lastTransitionTime":"2025-10-07T17:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.989670 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.989719 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.989729 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.989742 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:36 crc kubenswrapper[4681]: I1007 17:04:36.989754 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:36Z","lastTransitionTime":"2025-10-07T17:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.028176 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.028252 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.028322 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.028344 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:37 crc kubenswrapper[4681]: E1007 17:04:37.029138 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:04:37 crc kubenswrapper[4681]: E1007 17:04:37.029259 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:04:37 crc kubenswrapper[4681]: E1007 17:04:37.029391 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:04:37 crc kubenswrapper[4681]: E1007 17:04:37.028636 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.041263 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:37Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.053433 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:37Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.066291 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:37Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.075550 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xjf9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35b1b84e-518a-4567-8ad9-0e717e9958fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xjf9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:37Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.088867 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:37Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.092220 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.092249 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.092256 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.092270 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.092279 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:37Z","lastTransitionTime":"2025-10-07T17:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.103538 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc452c09c8f7b7c7c78ba1ca48d06b861e7f647975cf88452a4426686d360817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:04:25Z\\\",\\\"message\\\":\\\"2025-10-07T17:03:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b0bc378-da86-4f9a-a1e1-f918eee18f22\\\\n2025-10-07T17:03:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b0bc378-da86-4f9a-a1e1-f918eee18f22 to /host/opt/cni/bin/\\\\n2025-10-07T17:03:40Z [verbose] multus-daemon started\\\\n2025-10-07T17:03:40Z [verbose] Readiness Indicator file check\\\\n2025-10-07T17:04:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:37Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.116353 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f774e29adda222a4662ace0017b8b7697481452841491133bf4839360ba9a8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:37Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.130318 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:37Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.139439 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:37Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.151348 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:37Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.162461 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:37Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.181827 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0748b3f1e13507c2e70e5f09b7c00e078804394
78c2153c726b8a59312e25a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:04:34Z\\\",\\\"message\\\":\\\"rc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z]\\\\nI1007 17:04:34.901057 6612 lb_config.go:1031] Cluster endpoints for openshift-machine-api/machine-api-operator-webhook for network=default are: map[]\\\\nI1007 17:04:34.901078 6612 services_controller.go:443] Built service openshift-machine-api/machine-api-operator-webhook LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.254\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1007 17:04:34.901084 6612 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-daemon\\\\\\\"}\\\\nI1007 17:04:34.901064 6612 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/control-p\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:04:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d6lkl_openshift-ovn-kubernetes(615b8d72-0ec5-42d0-966e-db1c2b787962)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:37Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.194563 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.194604 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.194615 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.194630 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.194641 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:37Z","lastTransitionTime":"2025-10-07T17:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.200362 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c07906391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:37Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.210820 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19107d9c-8793-4766-9661-014743799a9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f498c4ac9c2e2aa10188ede03e77421d502ec718a71af923d84081943350a914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0751011a6e111dd3b3d09222e826afe9f712b02143a54130dfa00361cf6d3d98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0751011a6e111dd3b3d09222e826afe9f712b02143a54130dfa00361cf6d3d98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:37Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.222660 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8e19e2693808909e599802074123b31395c3cb8992f734a1d7191848532e953\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:37Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.233283 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:37Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.242779 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:37Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.252234 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a2c488b-e563-4bc2-aaec-064d33709f54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93918b739fba58a9b4e23d3131ae90e34548a07be5a59ceb18accaa6516572e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f7b57b64232d8263f8b4c64781ddba6d33986e7d379a64caa7b5abc9dae9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-52862\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:37Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.264984 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f5d9542-6447-4a77-829b-064c809cf81d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4eadece9eaef40838cea0c158dfd6208bf8392a02daab5c6e440143e2c9f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://942434d645ee2a2ed25d4535eec28588e1988b53927e54c50c1c15d293fe6a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99b069a866faa32130576f133ab0a61334f2e7f164cb87f204f032cc3c05391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f98542ab0c413832a25dc4529fa4fd723dee659e13fc3bff39b944185d969bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f98542ab0c413832a25dc4529fa4fd723dee659e13fc3bff39b944185d969bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:37Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.297227 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.297263 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.297274 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.297291 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.297303 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:37Z","lastTransitionTime":"2025-10-07T17:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.399071 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.399144 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.399153 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.399166 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.399175 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:37Z","lastTransitionTime":"2025-10-07T17:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.501276 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.501316 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.501324 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.501338 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.501347 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:37Z","lastTransitionTime":"2025-10-07T17:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.603833 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.603894 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.603906 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.603923 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.603935 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:37Z","lastTransitionTime":"2025-10-07T17:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.705949 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.705991 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.706001 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.706015 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.706025 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:37Z","lastTransitionTime":"2025-10-07T17:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.808106 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.808148 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.808159 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.808177 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.808190 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:37Z","lastTransitionTime":"2025-10-07T17:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.911047 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.911086 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.911125 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.911144 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:37 crc kubenswrapper[4681]: I1007 17:04:37.911155 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:37Z","lastTransitionTime":"2025-10-07T17:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.017910 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.018181 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.018190 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.018205 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.018217 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:38Z","lastTransitionTime":"2025-10-07T17:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.120097 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.120135 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.120144 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.120157 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.120166 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:38Z","lastTransitionTime":"2025-10-07T17:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.222655 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.222696 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.222713 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.222733 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.222746 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:38Z","lastTransitionTime":"2025-10-07T17:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
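Each "Node became not ready" record above serializes a single corev1.NodeCondition. A sketch reconstructing the same value, assuming the k8s.io/api and k8s.io/apimachinery modules are on the module path; the timestamps will naturally differ from the log:

package main

import (
	"encoding/json"
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)

func main() {
	// Illustrative only: the same condition value the kubelet
	// serializes in the records above.
	cond := corev1.NodeCondition{
		Type:               corev1.NodeReady,
		Status:             corev1.ConditionFalse,
		LastHeartbeatTime:  metav1.Now(),
		LastTransitionTime: metav1.Now(),
		Reason:             "KubeletNotReady",
		Message: "container runtime network not ready: NetworkReady=false " +
			"reason:NetworkPluginNotReady message:Network plugin returns error: " +
			"no CNI configuration file in /etc/kubernetes/cni/net.d/. " +
			"Has your network provider started?",
	}
	b, _ := json.Marshal(cond)
	fmt.Println(string(b))
}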
Has your network provider started?"} Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.325017 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.325050 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.325058 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.325071 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.325079 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:38Z","lastTransitionTime":"2025-10-07T17:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.427492 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.427533 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.427543 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.427560 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.427575 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:38Z","lastTransitionTime":"2025-10-07T17:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.529591 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.529630 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.529638 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.529651 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.529659 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:38Z","lastTransitionTime":"2025-10-07T17:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.632973 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.633011 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.633022 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.633039 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.633050 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:38Z","lastTransitionTime":"2025-10-07T17:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.734861 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.734923 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.734938 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.734958 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.734973 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:38Z","lastTransitionTime":"2025-10-07T17:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.837223 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.837260 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.837270 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.837285 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.837299 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:38Z","lastTransitionTime":"2025-10-07T17:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.939798 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.939833 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.939844 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.939858 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:38 crc kubenswrapper[4681]: I1007 17:04:38.939870 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:38Z","lastTransitionTime":"2025-10-07T17:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.029187 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.029267 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:39 crc kubenswrapper[4681]: E1007 17:04:39.029319 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.029204 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.029468 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:39 crc kubenswrapper[4681]: E1007 17:04:39.029460 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:04:39 crc kubenswrapper[4681]: E1007 17:04:39.029579 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
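The NetworkPluginNotReady loop and the sandbox errors above persist until something writes a network configuration into /etc/kubernetes/cni/net.d/. A rough sketch of that readiness test, assuming the .conf/.conflist/.json extensions libcni accepts are the only ones that matter:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasCNIConfig reports whether dir contains at least one CNI network
// configuration file. The extension list mirrors what libcni loads;
// treating it as exhaustive is an assumption of this sketch.
func hasCNIConfig(dir string) bool {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false
	}
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(hasCNIConfig("/etc/kubernetes/cni/net.d"))
}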
pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:04:39 crc kubenswrapper[4681]: E1007 17:04:39.029679 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.041494 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.041529 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.041538 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.041549 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.041559 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:39Z","lastTransitionTime":"2025-10-07T17:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.144229 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.144262 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.144272 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.144287 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.144297 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:39Z","lastTransitionTime":"2025-10-07T17:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.246401 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.246443 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.246454 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.246470 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.246479 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:39Z","lastTransitionTime":"2025-10-07T17:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.349530 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.349562 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.349570 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.349585 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.349594 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:39Z","lastTransitionTime":"2025-10-07T17:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.451853 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.451915 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.451925 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.451939 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.451949 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:39Z","lastTransitionTime":"2025-10-07T17:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.554716 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.554751 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.554760 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.554772 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.554782 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:39Z","lastTransitionTime":"2025-10-07T17:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.658109 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.658152 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.658164 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.658179 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.658194 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:39Z","lastTransitionTime":"2025-10-07T17:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.760068 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.760110 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.760121 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.760135 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.760145 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:39Z","lastTransitionTime":"2025-10-07T17:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.862753 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.862807 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.862824 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.862846 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.862861 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:39Z","lastTransitionTime":"2025-10-07T17:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.965528 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.965561 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.965571 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.965584 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:39 crc kubenswrapper[4681]: I1007 17:04:39.965593 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:39Z","lastTransitionTime":"2025-10-07T17:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.068116 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.068149 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.068156 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.068170 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.068179 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:40Z","lastTransitionTime":"2025-10-07T17:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.170042 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.170080 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.170090 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.170105 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.170115 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:40Z","lastTransitionTime":"2025-10-07T17:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.272409 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.272451 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.272461 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.272474 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.272484 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:40Z","lastTransitionTime":"2025-10-07T17:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.375030 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.375102 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.375111 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.375146 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.375155 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:40Z","lastTransitionTime":"2025-10-07T17:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.477906 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.477942 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.477952 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.477966 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.477977 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:40Z","lastTransitionTime":"2025-10-07T17:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.580583 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.580647 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.580661 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.580679 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.580690 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:40Z","lastTransitionTime":"2025-10-07T17:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.682447 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.682485 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.682495 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.682513 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.682522 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:40Z","lastTransitionTime":"2025-10-07T17:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.785104 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.785161 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.785177 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.785200 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.785216 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:40Z","lastTransitionTime":"2025-10-07T17:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.876458 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.876548 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.876585 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.876623 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.876652 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:40 crc kubenswrapper[4681]: E1007 17:04:40.876762 4681 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:44.876727322 +0000 UTC m=+148.524138917 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:04:40 crc kubenswrapper[4681]: E1007 17:04:40.876781 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 17:04:40 crc kubenswrapper[4681]: E1007 17:04:40.876797 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 17:04:40 crc kubenswrapper[4681]: E1007 17:04:40.876808 4681 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:04:40 crc kubenswrapper[4681]: E1007 17:04:40.876846 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 17:05:44.876835986 +0000 UTC m=+148.524247541 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:04:40 crc kubenswrapper[4681]: E1007 17:04:40.876872 4681 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 17:04:40 crc kubenswrapper[4681]: E1007 17:04:40.877066 4681 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 17:04:40 crc kubenswrapper[4681]: E1007 17:04:40.877095 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 17:05:44.877087853 +0000 UTC m=+148.524499408 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 17:04:40 crc kubenswrapper[4681]: E1007 17:04:40.877108 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 17:05:44.877101723 +0000 UTC m=+148.524513278 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 17:04:40 crc kubenswrapper[4681]: E1007 17:04:40.877164 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 17:04:40 crc kubenswrapper[4681]: E1007 17:04:40.877178 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 17:04:40 crc kubenswrapper[4681]: E1007 17:04:40.877186 4681 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:04:40 crc kubenswrapper[4681]: E1007 17:04:40.877210 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 17:05:44.877202246 +0000 UTC m=+148.524613801 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.887924 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.888130 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.888218 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.888303 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.888385 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:40Z","lastTransitionTime":"2025-10-07T17:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.991162 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.991454 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.991533 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.991630 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:40 crc kubenswrapper[4681]: I1007 17:04:40.991703 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:40Z","lastTransitionTime":"2025-10-07T17:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.029094 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.029165 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:41 crc kubenswrapper[4681]: E1007 17:04:41.029248 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
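The [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered] failures above arise because the kubelet only serves ConfigMaps and Secrets for pods that have first been registered with its object managers; until that happens, projected-volume setup keeps failing. A toy illustration of that guard; the registrationCache type and its methods are invented for this sketch:

package main

import (
	"fmt"
	"sync"
)

// registrationCache is an invented stand-in for the kubelet object
// managers behind the errors above: a key can only be read after
// something has registered interest in it.
type registrationCache struct {
	mu         sync.Mutex
	registered map[string]bool
	data       map[string]string
}

func (c *registrationCache) Register(key string) {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.registered[key] = true
}

func (c *registrationCache) Get(key string) (string, error) {
	c.mu.Lock()
	defer c.mu.Unlock()
	if !c.registered[key] {
		return "", fmt.Errorf("object %q not registered", key)
	}
	return c.data[key], nil
}

func main() {
	c := &registrationCache{
		registered: map[string]bool{},
		data:       map[string]string{"openshift-network-diagnostics/kube-root-ca.crt": "<ca bundle>"},
	}
	if _, err := c.Get("openshift-network-diagnostics/kube-root-ca.crt"); err != nil {
		fmt.Println(err) // fails until Register is called
	}
	c.Register("openshift-network-diagnostics/kube-root-ca.crt")
	v, _ := c.Get("openshift-network-diagnostics/kube-root-ca.crt")
	fmt.Println(v)
}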
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.029261 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.029264 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:41 crc kubenswrapper[4681]: E1007 17:04:41.029335 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:04:41 crc kubenswrapper[4681]: E1007 17:04:41.029418 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:04:41 crc kubenswrapper[4681]: E1007 17:04:41.029459 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.094210 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.094258 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.094269 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.094287 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.094299 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:41Z","lastTransitionTime":"2025-10-07T17:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.196236 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.196525 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.196722 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.196955 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.197193 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:41Z","lastTransitionTime":"2025-10-07T17:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.299762 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.299801 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.299812 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.299830 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.299844 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:41Z","lastTransitionTime":"2025-10-07T17:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.402432 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.402539 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.402558 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.402581 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.402597 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:41Z","lastTransitionTime":"2025-10-07T17:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.504661 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.504741 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.504753 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.504769 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.504780 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:41Z","lastTransitionTime":"2025-10-07T17:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.606944 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.606986 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.606997 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.607013 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.607023 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:41Z","lastTransitionTime":"2025-10-07T17:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.709168 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.709205 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.709214 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.709230 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.709240 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:41Z","lastTransitionTime":"2025-10-07T17:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.812083 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.812120 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.812131 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.812148 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.812161 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:41Z","lastTransitionTime":"2025-10-07T17:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.914347 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.914409 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.914429 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.914454 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:41 crc kubenswrapper[4681]: I1007 17:04:41.914471 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:41Z","lastTransitionTime":"2025-10-07T17:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.017123 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.017149 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.017157 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.017169 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.017178 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:42Z","lastTransitionTime":"2025-10-07T17:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.119065 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.119091 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.119098 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.119113 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.119121 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:42Z","lastTransitionTime":"2025-10-07T17:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.221368 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.221421 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.221432 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.221449 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.221459 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:42Z","lastTransitionTime":"2025-10-07T17:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.323388 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.323416 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.323424 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.323438 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.323447 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:42Z","lastTransitionTime":"2025-10-07T17:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.425579 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.425614 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.425625 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.425638 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.425646 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:42Z","lastTransitionTime":"2025-10-07T17:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.528038 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.528069 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.528079 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.528094 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.528105 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:42Z","lastTransitionTime":"2025-10-07T17:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.629692 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.629725 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.629736 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.629751 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.629761 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:42Z","lastTransitionTime":"2025-10-07T17:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.731733 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.731785 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.731795 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.731810 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.731823 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:42Z","lastTransitionTime":"2025-10-07T17:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.833944 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.834008 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.834018 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.834034 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.834042 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:42Z","lastTransitionTime":"2025-10-07T17:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.935689 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.935724 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.935732 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.935746 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:42 crc kubenswrapper[4681]: I1007 17:04:42.935755 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:42Z","lastTransitionTime":"2025-10-07T17:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.028627 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.028671 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.028719 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.028643 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:43 crc kubenswrapper[4681]: E1007 17:04:43.028764 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:04:43 crc kubenswrapper[4681]: E1007 17:04:43.028913 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:04:43 crc kubenswrapper[4681]: E1007 17:04:43.029011 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:04:43 crc kubenswrapper[4681]: E1007 17:04:43.029097 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.038234 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.038402 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.038484 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.038579 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.038668 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:43Z","lastTransitionTime":"2025-10-07T17:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.140173 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.140399 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.140485 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.140569 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.140647 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:43Z","lastTransitionTime":"2025-10-07T17:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.242722 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.242974 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.243100 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.243189 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.243265 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:43Z","lastTransitionTime":"2025-10-07T17:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.345530 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.345558 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.345566 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.345579 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.345589 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:43Z","lastTransitionTime":"2025-10-07T17:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.447969 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.448035 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.448059 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.448087 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.448104 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:43Z","lastTransitionTime":"2025-10-07T17:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.551106 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.551436 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.551567 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.551751 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.551977 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:43Z","lastTransitionTime":"2025-10-07T17:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.655281 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.655345 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.655371 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.655400 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.655422 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:43Z","lastTransitionTime":"2025-10-07T17:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.757966 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.758683 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.758816 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.758951 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.759058 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:43Z","lastTransitionTime":"2025-10-07T17:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.862039 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.862074 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.862085 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.862099 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.862107 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:43Z","lastTransitionTime":"2025-10-07T17:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.965539 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.965585 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.965596 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.965612 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:43 crc kubenswrapper[4681]: I1007 17:04:43.965625 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:43Z","lastTransitionTime":"2025-10-07T17:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.069501 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.069543 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.069554 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.069572 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.069583 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:44Z","lastTransitionTime":"2025-10-07T17:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.173153 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.173453 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.173564 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.173682 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.173791 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:44Z","lastTransitionTime":"2025-10-07T17:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.277009 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.277078 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.277091 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.277109 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.277123 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:44Z","lastTransitionTime":"2025-10-07T17:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.380152 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.380187 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.380197 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.380215 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.380225 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:44Z","lastTransitionTime":"2025-10-07T17:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.483296 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.483441 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.483478 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.483520 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.483555 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:44Z","lastTransitionTime":"2025-10-07T17:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.586813 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.586842 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.586850 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.586861 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.586870 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:44Z","lastTransitionTime":"2025-10-07T17:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.688837 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.689221 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.689405 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.689593 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.689735 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:44Z","lastTransitionTime":"2025-10-07T17:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.691229 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.691258 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.691267 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.691281 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.691291 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:44Z","lastTransitionTime":"2025-10-07T17:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:44 crc kubenswrapper[4681]: E1007 17:04:44.710237 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:44Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.714289 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.714459 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.714576 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.714680 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.714782 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:44Z","lastTransitionTime":"2025-10-07T17:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:44 crc kubenswrapper[4681]: E1007 17:04:44.732060 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:44Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.736288 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.736323 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.736333 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.736346 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.736355 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:44Z","lastTransitionTime":"2025-10-07T17:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:44 crc kubenswrapper[4681]: E1007 17:04:44.752697 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:44Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.755825 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.755911 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.755925 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.755943 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.755956 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:44Z","lastTransitionTime":"2025-10-07T17:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:44 crc kubenswrapper[4681]: E1007 17:04:44.769809 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:44Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.774093 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.774131 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.774142 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.774156 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.774167 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:44Z","lastTransitionTime":"2025-10-07T17:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:44 crc kubenswrapper[4681]: E1007 17:04:44.785439 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:44Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:44 crc kubenswrapper[4681]: E1007 17:04:44.785601 4681 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.791622 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.791745 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.791870 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.791993 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.792071 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:44Z","lastTransitionTime":"2025-10-07T17:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.894610 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.894672 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.894692 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.894716 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.894750 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:44Z","lastTransitionTime":"2025-10-07T17:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.997828 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.997864 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.997872 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.997902 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:44 crc kubenswrapper[4681]: I1007 17:04:44.997911 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:44Z","lastTransitionTime":"2025-10-07T17:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.028858 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.028905 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:45 crc kubenswrapper[4681]: E1007 17:04:45.029166 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.029239 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:45 crc kubenswrapper[4681]: E1007 17:04:45.029300 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:04:45 crc kubenswrapper[4681]: E1007 17:04:45.029428 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.029487 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:45 crc kubenswrapper[4681]: E1007 17:04:45.029595 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.100199 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.100274 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.100286 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.100302 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.100314 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:45Z","lastTransitionTime":"2025-10-07T17:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.203673 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.203737 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.203760 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.203932 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.203951 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:45Z","lastTransitionTime":"2025-10-07T17:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.306496 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.306546 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.306558 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.306574 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.306583 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:45Z","lastTransitionTime":"2025-10-07T17:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.409582 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.409668 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.409692 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.409723 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.409744 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:45Z","lastTransitionTime":"2025-10-07T17:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.511931 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.511995 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.512013 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.512050 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.512070 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:45Z","lastTransitionTime":"2025-10-07T17:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.633183 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.633224 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.633236 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.633256 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.633270 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:45Z","lastTransitionTime":"2025-10-07T17:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.736230 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.736286 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.736297 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.736314 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.736326 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:45Z","lastTransitionTime":"2025-10-07T17:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.839259 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.839297 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.839308 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.839325 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.839336 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:45Z","lastTransitionTime":"2025-10-07T17:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.941682 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.941737 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.941753 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.941776 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:45 crc kubenswrapper[4681]: I1007 17:04:45.941793 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:45Z","lastTransitionTime":"2025-10-07T17:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.043739 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.043768 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.043777 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.043792 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.043802 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:46Z","lastTransitionTime":"2025-10-07T17:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.146069 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.146108 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.146119 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.146135 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.146145 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:46Z","lastTransitionTime":"2025-10-07T17:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.248493 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.248523 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.248531 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.248544 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.248552 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:46Z","lastTransitionTime":"2025-10-07T17:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.351498 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.351527 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.351536 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.351549 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.351558 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:46Z","lastTransitionTime":"2025-10-07T17:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.455307 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.455398 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.455430 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.455461 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.455482 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:46Z","lastTransitionTime":"2025-10-07T17:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.558237 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.558292 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.558303 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.558321 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.558333 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:46Z","lastTransitionTime":"2025-10-07T17:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.662260 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.662318 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.662334 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.662356 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.662373 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:46Z","lastTransitionTime":"2025-10-07T17:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.765839 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.765917 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.765932 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.765949 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.765959 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:46Z","lastTransitionTime":"2025-10-07T17:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.869863 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.870026 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.870052 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.870079 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.870095 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:46Z","lastTransitionTime":"2025-10-07T17:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.973832 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.973914 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.973939 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.973970 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:46 crc kubenswrapper[4681]: I1007 17:04:46.973991 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:46Z","lastTransitionTime":"2025-10-07T17:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.029157 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.029273 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:47 crc kubenswrapper[4681]: E1007 17:04:47.029390 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.029623 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:47 crc kubenswrapper[4681]: E1007 17:04:47.029733 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.029810 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:47 crc kubenswrapper[4681]: E1007 17:04:47.029936 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:04:47 crc kubenswrapper[4681]: E1007 17:04:47.029990 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.040764 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
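The same four pods cycle through "No sandbox for pod can be found" and "Error syncing pod, skipping" every retry interval. A rough triage sketch, assuming journal text is piped in on stdin (for example from journalctl -u kubelet): it tallies the pod_workers.go errors per pod so the repetition collapses into a few counters. The pod="..." extraction is a simple regexp, not a full klog parser.

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"strings"
)

func main() {
	podRE := regexp.MustCompile(`pod="([^"]+)"`)
	counts := map[string]int{}

	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be very long
	for sc.Scan() {
		line := sc.Text()
		if !strings.Contains(line, "Error syncing pod") {
			continue
		}
		if m := podRE.FindStringSubmatch(line); m != nil {
			counts[m[1]]++
		}
	}
	for pod, n := range counts {
		fmt.Printf("%6d %s\n", n, pod)
	}
}

Against the records above this would report a handful of counts for network-metrics-daemon-xjf9z, network-check-source-55646444c4-trplf, network-check-target-xd92c, and networking-console-plugin-85b44fc459-gdk6g, all blocked on the same missing CNI configuration.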
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19107d9c-8793-4766-9661-014743799a9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f498c4ac9c2e2aa10188ede03e77421d502ec718a71af923d84081943350a914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0751011a6e111dd3b3d09222e826afe9f712b02143a54130dfa00361cf6d3d98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0751011a6e111dd3b3d09222e826afe9f712b02143a54130dfa00361cf6d3d98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.062380 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8e19e2693808909e599802074123b31395c3cb8992f734a1d7191848532e953\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.078024 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.078070 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.078086 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.078109 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.078125 4681 setters.go:603] "Node became not ready" node="crc" 
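Every status patch failure above blames the same webhook serving certificate behind https://127.0.0.1:9743 ("certificate has expired or is not yet valid"). An illustrative Go sketch, meant to be run on the node itself: it fetches that certificate without trusting it and prints the validity window, so the NotAfter (2025-08-24T17:21:41Z per the log) can be compared with the current time. The address comes from the log lines; nothing else here is taken from the cluster.

package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// InsecureSkipVerify because the goal is to inspect the cert, not trust it.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	now := time.Now().UTC()
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.UTC().Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.UTC().Format(time.RFC3339))
	fmt.Printf("expired:   %v\n", now.After(cert.NotAfter))
}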
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:47Z","lastTransitionTime":"2025-10-07T17:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.080903 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.097096 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.116525 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a2c488b-e563-4bc2-aaec-064d33709f54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93918b739fba58a9b4e23d3131ae90e34548a07be5a59ceb18accaa6516572e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f7b57b64232d8263f8b4c64781ddba6d33986e7d379a64caa7b5abc9dae9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-52862\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.135415 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f5d9542-6447-4a77-829b-064c809cf81d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4eadece9eaef40838cea0c158dfd6208bf8392a02daab5c6e440143e2c9f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://942434d645ee2a2ed25d4535eec28588e1988b53927e54c50c1c15d293fe6a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99b069a866faa32130576f133ab0a61334f2e7f164cb87f204f032cc3c05391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f98542ab0c413832a25dc4529fa4fd723dee659e13fc3bff39b944185d969bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f98542ab0c413832a25dc4529fa4fd723dee659e13fc3bff39b944185d969bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.148434 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.160105 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.176418 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.180396 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.180433 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.180442 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.180460 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.180470 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:47Z","lastTransitionTime":"2025-10-07T17:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.188253 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xjf9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35b1b84e-518a-4567-8ad9-0e717e9958fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xjf9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.201100 4681 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f
23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.217827 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc452c09c8f7b7c7c78ba1ca48d06b861e7f647975cf88452a4426686d360817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:04:25Z\\\",\\\"message\\\":\\\"2025-10-07T17:03:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b0bc378-da86-4f9a-a1e1-f918eee18f22\\\\n2025-10-07T17:03:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b0bc378-da86-4f9a-a1e1-f918eee18f22 to 
/host/opt/cni/bin/\\\\n2025-10-07T17:03:40Z [verbose] multus-daemon started\\\\n2025-10-07T17:03:40Z [verbose] Readiness Indicator file check\\\\n2025-10-07T17:04:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.239615 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f774e29adda222a4662ace0017b8b7697481452841491133bf4839360ba9a8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.250465 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.263097 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.277032 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.282417 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.282461 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.282478 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.282502 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.282519 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:47Z","lastTransitionTime":"2025-10-07T17:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.290490 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.317927 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:04:34Z\\\",\\\"message\\\":\\\"rc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z]\\\\nI1007 17:04:34.901057 6612 lb_config.go:1031] Cluster endpoints for openshift-machine-api/machine-api-operator-webhook for network=default are: map[]\\\\nI1007 17:04:34.901078 6612 services_controller.go:443] Built service openshift-machine-api/machine-api-operator-webhook LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.254\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1007 17:04:34.901084 6612 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-daemon\\\\\\\"}\\\\nI1007 17:04:34.901064 6612 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/control-p\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:04:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d6lkl_openshift-ovn-kubernetes(615b8d72-0ec5-42d0-966e-db1c2b787962)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.343348 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7f
b6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c07906391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:47Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.384718 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.384756 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:47 crc 
kubenswrapper[4681]: I1007 17:04:47.384764 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.384779 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.384787 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:47Z","lastTransitionTime":"2025-10-07T17:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.486773 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.486826 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.486856 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.486873 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.487164 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:47Z","lastTransitionTime":"2025-10-07T17:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
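
The failed status patch recorded above is a clock-versus-certificate problem, not an etcd problem: pod writes pass through the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743, and that endpoint's serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2025-10-07T17:04:47Z, so every patch is rejected with an x509 error. A minimal sketch for confirming such an expiry from the node follows; the host and port are taken from the log line, and the third-party cryptography package is an assumed dependency (the server may also demand SNI or client certificates, which this sketch ignores):

    # Sketch: fetch a TLS endpoint's serving certificate and compare its
    # validity window with the current time. Host/port come from the failed
    # webhook call above; requires `pip install cryptography`.
    import ssl
    from datetime import datetime, timezone
    from cryptography import x509

    HOST, PORT = "127.0.0.1", 9743  # endpoint of the failing webhook

    # With no CA bundle supplied, get_server_certificate() does not verify
    # the chain, so it can still retrieve an already-expired certificate.
    pem = ssl.get_server_certificate((HOST, PORT))
    cert = x509.load_pem_x509_certificate(pem.encode())

    now = datetime.now(timezone.utc).replace(tzinfo=None)  # naive UTC
    print("subject  :", cert.subject.rfc4514_string())
    print("notBefore:", cert.not_valid_before)  # naive UTC datetime
    print("notAfter :", cert.not_valid_after)
    print("expired  :", now > cert.not_valid_after)

On CRC this condition typically clears once the cluster rotates its internal certificates after resuming from a stale VM image; the check above only confirms which side of the notAfter boundary the node clock sits on.
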
Has your network provider started?"} Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.692669 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.692770 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.692789 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.692811 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.692827 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:47Z","lastTransitionTime":"2025-10-07T17:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.795278 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.795321 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.795329 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.795342 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.795351 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:47Z","lastTransitionTime":"2025-10-07T17:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.897375 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.897430 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.897442 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.897459 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:47 crc kubenswrapper[4681]: I1007 17:04:47.897472 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:47Z","lastTransitionTime":"2025-10-07T17:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.000550 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.000588 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.000600 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.000617 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.000628 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:48Z","lastTransitionTime":"2025-10-07T17:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.104355 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.104419 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.104445 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.104476 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.104498 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:48Z","lastTransitionTime":"2025-10-07T17:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.206613 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.206665 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.206691 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.206711 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.206723 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:48Z","lastTransitionTime":"2025-10-07T17:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.308547 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.308587 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.308598 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.308613 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.308622 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:48Z","lastTransitionTime":"2025-10-07T17:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.410980 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.411026 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.411036 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.411051 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.411061 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:48Z","lastTransitionTime":"2025-10-07T17:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.512490 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.512524 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.512532 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.512545 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.512594 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:48Z","lastTransitionTime":"2025-10-07T17:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.614836 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.614895 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.614908 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.614927 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.614939 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:48Z","lastTransitionTime":"2025-10-07T17:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.717579 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.717706 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.717731 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.717803 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.717830 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:48Z","lastTransitionTime":"2025-10-07T17:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.820270 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.820330 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.820346 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.820370 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:48 crc kubenswrapper[4681]: I1007 17:04:48.820386 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:48Z","lastTransitionTime":"2025-10-07T17:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
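
Every "Node became not ready" record in this run carries the same root cause: the container runtime reports NetworkReady=false because no CNI configuration file exists in /etc/kubernetes/cni/net.d/, and the kubelet republishes the NotReady condition on each sync until the network plugin (OVN-Kubernetes on OpenShift) writes its config there. A small stdlib-only sketch that inspects the directory named in the kubelet message:

    # Sketch: report whether the CNI config directory holds a usable
    # network definition. The path is the one named in the log message.
    import json, os, sys

    CNI_DIR = "/etc/kubernetes/cni/net.d"

    try:
        entries = sorted(os.listdir(CNI_DIR))
    except FileNotFoundError:
        sys.exit(f"{CNI_DIR} does not exist")

    confs = [e for e in entries if e.endswith((".conf", ".conflist", ".json"))]
    if not confs:
        sys.exit(f"no CNI config in {CNI_DIR} -- network plugin not started?")

    for name in confs:
        with open(os.path.join(CNI_DIR, name)) as f:
            data = json.load(f)
        # .conflist files carry a "plugins" array; plain .conf files a single "type"
        kinds = [p.get("type") for p in data.get("plugins", [data])]
        print(f"{name}: cniVersion={data.get('cniVersion')} plugins={kinds}")
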
Oct 07 17:04:49 crc kubenswrapper[4681]: I1007 17:04:49.029052 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:49 crc kubenswrapper[4681]: I1007 17:04:49.029073 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:49 crc kubenswrapper[4681]: I1007 17:04:49.029117 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:49 crc kubenswrapper[4681]: E1007 17:04:49.029164 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:04:49 crc kubenswrapper[4681]: E1007 17:04:49.029369 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:04:49 crc kubenswrapper[4681]: I1007 17:04:49.029610 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:49 crc kubenswrapper[4681]: E1007 17:04:49.029803 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:04:49 crc kubenswrapper[4681]: E1007 17:04:49.030028 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
[... the same node-status cycle repeats roughly every 100 ms from 17:04:49.130 through 17:04:50.988, unchanged apart from the timestamps ...]
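
While this loop runs, the node object's Ready condition stays False with reason KubeletNotReady; once a CNI config appears, the same setters.go code path flips it back to True. One way to watch for that transition from outside the kubelet is to poll the API, sketched here by shelling out to kubectl (node name "crc" is taken from the log; a working kubeconfig is assumed):

    # Sketch: poll the node's Ready condition until it turns True.
    import json, subprocess, time

    NODE = "crc"  # node name from the log

    def ready_condition(node: str) -> dict:
        out = subprocess.check_output(
            ["kubectl", "get", "node", node, "-o", "json"])
        status = json.loads(out)["status"]
        return next(c for c in status["conditions"] if c["type"] == "Ready")

    while True:
        cond = ready_condition(NODE)
        print(cond["status"], cond["reason"], cond.get("message", "")[:60])
        if cond["status"] == "True":
            break
        time.sleep(5)
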
Oct 07 17:04:51 crc kubenswrapper[4681]: I1007 17:04:51.028994 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:51 crc kubenswrapper[4681]: I1007 17:04:51.029058 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:51 crc kubenswrapper[4681]: I1007 17:04:51.029403 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:51 crc kubenswrapper[4681]: I1007 17:04:51.029404 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:51 crc kubenswrapper[4681]: E1007 17:04:51.029588 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:04:51 crc kubenswrapper[4681]: E1007 17:04:51.029694 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:04:51 crc kubenswrapper[4681]: E1007 17:04:51.029789 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:04:51 crc kubenswrapper[4681]: E1007 17:04:51.030011 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb"
[... the same node-status cycle repeats roughly every 100 ms from 17:04:51.092 through 17:04:51.706 ...] Oct 07 17:04:51 crc kubenswrapper[4681]: I1007 17:04:51.809951 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:51 crc kubenswrapper[4681]: I1007 17:04:51.809981 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:51 crc kubenswrapper[4681]: I1007 17:04:51.809992 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:51 crc kubenswrapper[4681]: I1007 17:04:51.810006 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:51 crc kubenswrapper[4681]: I1007 17:04:51.810016 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:51Z","lastTransitionTime":"2025-10-07T17:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:51 crc kubenswrapper[4681]: I1007 17:04:51.913042 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:51 crc kubenswrapper[4681]: I1007 17:04:51.913106 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:51 crc kubenswrapper[4681]: I1007 17:04:51.913128 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:51 crc kubenswrapper[4681]: I1007 17:04:51.913154 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:51 crc kubenswrapper[4681]: I1007 17:04:51.913173 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:51Z","lastTransitionTime":"2025-10-07T17:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:52 crc kubenswrapper[4681]: I1007 17:04:52.015160 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:52 crc kubenswrapper[4681]: I1007 17:04:52.015196 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:52 crc kubenswrapper[4681]: I1007 17:04:52.015210 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:52 crc kubenswrapper[4681]: I1007 17:04:52.015226 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:52 crc kubenswrapper[4681]: I1007 17:04:52.015236 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:52Z","lastTransitionTime":"2025-10-07T17:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 07 17:04:52 crc kubenswrapper[4681]: I1007 17:04:52.028644 4681 scope.go:117] "RemoveContainer" containerID="c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5" Oct 07 17:04:52 crc kubenswrapper[4681]: E1007 17:04:52.028792 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-d6lkl_openshift-ovn-kubernetes(615b8d72-0ec5-42d0-966e-db1c2b787962)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962"
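The "back-off 40s" in the CrashLoopBackOff error above follows the kubelet's container restart back-off, which (per upstream kubelet defaults, an assumption not stated in this log) starts at 10s and doubles per consecutive failure, capped at 5m. A sketch of that schedule:

package main

import (
	"fmt"
	"time"
)

// backoff returns the assumed kubelet restart delay after the given
// number of consecutive container failures: 10s initial, doubling,
// capped at 5 minutes. Constants taken from upstream kubelet defaults,
// not from this log.
func backoff(failures int) time.Duration {
	d := 10 * time.Second
	for i := 1; i < failures; i++ {
		d *= 2
		if d >= 5*time.Minute {
			return 5 * time.Minute
		}
	}
	return d
}

func main() {
	for n := 1; n <= 6; n++ {
		fmt.Printf("failure %d -> wait %v\n", n, backoff(n))
	}
}

Under those assumptions, 40s corresponds to the third consecutive failure of ovnkube-controller; the delay resets only after the container runs cleanly for a while.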
Oct 07 17:04:53 crc kubenswrapper[4681]: I1007 17:04:53.028609 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:53 crc kubenswrapper[4681]: I1007 17:04:53.028653 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:53 crc kubenswrapper[4681]: I1007 17:04:53.028676 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:53 crc kubenswrapper[4681]: I1007 17:04:53.028626 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:53 crc kubenswrapper[4681]: E1007 17:04:53.028740 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:04:53 crc kubenswrapper[4681]: E1007 17:04:53.028817 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:04:53 crc kubenswrapper[4681]: E1007 17:04:53.028930 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:04:53 crc kubenswrapper[4681]: E1007 17:04:53.029070 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:04:53 crc kubenswrapper[4681]: I1007 17:04:53.042411 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:53 crc kubenswrapper[4681]: I1007 17:04:53.042458 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:53 crc kubenswrapper[4681]: I1007 17:04:53.042476 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:53 crc kubenswrapper[4681]: I1007 17:04:53.042494 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:53 crc kubenswrapper[4681]: I1007 17:04:53.042513 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:53Z","lastTransitionTime":"2025-10-07T17:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:53 crc kubenswrapper[4681]: I1007 17:04:53.145678 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:53 crc kubenswrapper[4681]: I1007 17:04:53.145720 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:53 crc kubenswrapper[4681]: I1007 17:04:53.145730 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:53 crc kubenswrapper[4681]: I1007 17:04:53.145746 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:53 crc kubenswrapper[4681]: I1007 17:04:53.145757 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:53Z","lastTransitionTime":"2025-10-07T17:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 07 17:04:54 crc kubenswrapper[4681]: E1007 17:04:54.972929 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:54Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:54 crc kubenswrapper[4681]: I1007 17:04:54.976385 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:54 crc kubenswrapper[4681]: I1007 17:04:54.976544 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
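The status patch fails because the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743 serves a certificate whose NotAfter (2025-08-24T17:21:41Z) is before the current time; the kubelet retries immediately with the same payload and fails identically. A minimal Go sketch reproducing that x509 validity check against the same endpoint (InsecureSkipVerify is used deliberately so the expired leaf can be inspected instead of aborting the handshake):

package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

// Dial the webhook endpoint from the log, skip chain verification so
// the handshake completes, then compare the leaf certificate's
// validity window against the current time, as the x509 check does.
func main() {
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Println("dial failed:", err)
		return
	}
	defer conn.Close()
	leaf := conn.ConnectionState().PeerCertificates[0]
	now := time.Now()
	fmt.Printf("NotBefore=%s NotAfter=%s now=%s\n", leaf.NotBefore, leaf.NotAfter, now.Format(time.RFC3339))
	if now.After(leaf.NotAfter) {
		fmt.Println("certificate has expired - matches the kubelet error above")
	}
}

Until that certificate is rotated, every node-status patch will keep failing with the same x509 error, independently of the CNI problem earlier in the log.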
event="NodeHasNoDiskPressure" Oct 07 17:04:54 crc kubenswrapper[4681]: I1007 17:04:54.976641 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:54 crc kubenswrapper[4681]: I1007 17:04:54.976743 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:54 crc kubenswrapper[4681]: I1007 17:04:54.976851 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:54Z","lastTransitionTime":"2025-10-07T17:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:54 crc kubenswrapper[4681]: E1007 17:04:54.988828 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:54Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:54 crc kubenswrapper[4681]: I1007 17:04:54.992912 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:54 crc kubenswrapper[4681]: I1007 17:04:54.992947 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
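Every one of these failed patches has the same root cause: the serving certificate presented by the network-node-identity webhook on https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, weeks before this boot. A minimal Go probe — hypothetical diagnostic tooling, not part of the cluster, and assuming it runs on the node itself since the webhook listens on loopback — shows exactly what the kubelet's TLS handshake sees:

    package main

    import (
        "crypto/tls"
        "fmt"
        "log"
    )

    func main() {
        // InsecureSkipVerify lets us complete the handshake and inspect the
        // certificate even though it no longer verifies — that failure is
        // precisely what we are diagnosing.
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            log.Fatalf("dial: %v", err)
        }
        defer conn.Close()

        for _, cert := range conn.ConnectionState().PeerCertificates {
            fmt.Printf("subject=%s notBefore=%s notAfter=%s\n",
                cert.Subject, cert.NotBefore, cert.NotAfter)
        }
    }

Run against this node it should print a notAfter matching the 2025-08-24T17:21:41Z in the log. On a CRC node this pattern typically appears when the VM is started long after its embedded certificates lapsed; the cluster normally rotates them shortly after startup, so the errors are expected to clear on their own — if they persist, the certificate has to be renewed out of band.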
event="NodeHasNoDiskPressure" Oct 07 17:04:54 crc kubenswrapper[4681]: I1007 17:04:54.992956 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:54 crc kubenswrapper[4681]: I1007 17:04:54.992971 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:54 crc kubenswrapper[4681]: I1007 17:04:54.992981 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:54Z","lastTransitionTime":"2025-10-07T17:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:55 crc kubenswrapper[4681]: E1007 17:04:55.005590 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:55Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.008668 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.008700 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
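Independently of the webhook failure, the Ready condition stays False because the runtime reports no CNI configuration in /etc/kubernetes/cni/net.d/ — on this platform the network plugin typically writes its config there only once the network pods come up. A quick way to watch for it is the same directory scan the CNI machinery performs; this is a hypothetical helper, and the .conf/.conflist/.json extension list mirrors common libcni convention rather than quoting the cri-o source:

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    // confDir is the exact path named in the log lines above.
    const confDir = "/etc/kubernetes/cni/net.d"

    func main() {
        var found []string
        // Extensions conventionally loaded by the CNI config loader (assumption).
        for _, ext := range []string{".conf", ".conflist", ".json"} {
            matches, _ := filepath.Glob(filepath.Join(confDir, "*"+ext))
            found = append(found, matches...)
        }
        if len(found) == 0 {
            fmt.Println("no CNI configuration file found: node will stay NotReady")
            os.Exit(1)
        }
        for _, f := range found {
            fmt.Println("CNI config:", f)
        }
    }

As soon as a config file lands in that directory the runtime flips NetworkReady to true and the KubeletNotReady spam stops.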
event="NodeHasNoDiskPressure" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.008711 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.008726 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.008737 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:55Z","lastTransitionTime":"2025-10-07T17:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:55 crc kubenswrapper[4681]: E1007 17:04:55.019168 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:55Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.021979 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.022010 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
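The status patch buried in each err=... string is ordinary JSON that has been quoted twice on its way into the journal — once when the patch was embedded in the Go error string and once more when klog quoted err — which is why every quote surfaces as \\\". A throwaway Go helper (hypothetical, for reading these lines offline; the hard-coded sample is a shortened excerpt of the capacity block in the patch above) undoes the escaping and pretty-prints the result:

    package main

    import (
        "bytes"
        "encoding/json"
        "fmt"
        "strings"
    )

    func main() {
        // A slice of the payload exactly as it appears in the journal,
        // with its doubled escaping intact (truncated for the example).
        raw := `{\\\"status\\\":{\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"memory\\\":\\\"24608864Ki\\\"}}}`

        // Undo both quoting levels: \\\" -> " for the inner level,
        // then \" -> " for the outer one (a no-op on this sample).
        s := strings.ReplaceAll(raw, `\\\"`, `"`)
        s = strings.ReplaceAll(s, `\"`, `"`)

        var pretty bytes.Buffer
        if err := json.Indent(&pretty, []byte(s), "", "  "); err != nil {
            fmt.Println("not valid JSON after unescaping:", err)
            return
        }
        fmt.Println(pretty.String())
    }

Applied to a full err string, this recovers the strategic-merge patch the kubelet was sending: the $setElementOrder/conditions directive, the four conditions, and the cached-image inventory.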
event="NodeHasNoDiskPressure" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.022021 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.022035 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.022045 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:55Z","lastTransitionTime":"2025-10-07T17:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.028733 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.028791 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.028858 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:55 crc kubenswrapper[4681]: E1007 17:04:55.028961 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.029102 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:55 crc kubenswrapper[4681]: E1007 17:04:55.029195 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:04:55 crc kubenswrapper[4681]: E1007 17:04:55.029488 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:04:55 crc kubenswrapper[4681]: E1007 17:04:55.029637 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:04:55 crc kubenswrapper[4681]: E1007 17:04:55.033719 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:04:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:55Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:55 crc kubenswrapper[4681]: E1007 17:04:55.033817 4681 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.035062 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
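The "Unable to update node status" line closes out the retry behavior visible above: the kubelet attempts the status patch a fixed number of times per sync (nodeStatusUpdateRetry, 5 in current upstream sources) before giving up until the next sync period, which matches the five "will retry" errors culminating at 17:04:55.033817. A stripped-down sketch of that control flow — a paraphrase of the upstream structure, not a verbatim copy:

    package main

    import (
        "errors"
        "fmt"
    )

    // nodeStatusUpdateRetry mirrors the upstream kubelet constant.
    const nodeStatusUpdateRetry = 5

    // tryUpdateNodeStatus stands in for the real call that PATCHes the Node
    // object; here it always fails the way the webhook fails in this log.
    func tryUpdateNodeStatus() error {
        return errors.New("failed calling webhook: x509: certificate has expired or is not yet valid")
    }

    func updateNodeStatus() error {
        for i := 0; i < nodeStatusUpdateRetry; i++ {
            if err := tryUpdateNodeStatus(); err != nil {
                fmt.Printf("Error updating node status, will retry: %v\n", err)
                continue
            }
            return nil
        }
        return fmt.Errorf("update node status exceeds retry count")
    }

    func main() {
        if err := updateNodeStatus(); err != nil {
            fmt.Println("Unable to update node status:", err)
        }
    }

Because the admission webhook rejects every PATCH, each round exhausts its five attempts the same way, and the whole cycle repeats on the next status-sync tick.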
event="NodeHasSufficientMemory" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.035084 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.035094 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.035107 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.035119 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:55Z","lastTransitionTime":"2025-10-07T17:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.137160 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.137201 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.137215 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.137234 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.137247 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:55Z","lastTransitionTime":"2025-10-07T17:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.240049 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.240096 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.240106 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.240123 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.240135 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:55Z","lastTransitionTime":"2025-10-07T17:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.335846 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/35b1b84e-518a-4567-8ad9-0e717e9958fb-metrics-certs\") pod \"network-metrics-daemon-xjf9z\" (UID: \"35b1b84e-518a-4567-8ad9-0e717e9958fb\") " pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:55 crc kubenswrapper[4681]: E1007 17:04:55.336012 4681 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 17:04:55 crc kubenswrapper[4681]: E1007 17:04:55.336081 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35b1b84e-518a-4567-8ad9-0e717e9958fb-metrics-certs podName:35b1b84e-518a-4567-8ad9-0e717e9958fb nodeName:}" failed. No retries permitted until 2025-10-07 17:05:59.3360639 +0000 UTC m=+162.983475465 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/35b1b84e-518a-4567-8ad9-0e717e9958fb-metrics-certs") pod "network-metrics-daemon-xjf9z" (UID: "35b1b84e-518a-4567-8ad9-0e717e9958fb") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.342550 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.342598 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.342613 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.342630 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.342642 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:55Z","lastTransitionTime":"2025-10-07T17:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.445321 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.445375 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.445391 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.445412 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.445427 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:55Z","lastTransitionTime":"2025-10-07T17:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.548108 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.548133 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.548141 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.548156 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.548167 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:55Z","lastTransitionTime":"2025-10-07T17:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.656343 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.656378 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.656387 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.656402 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.656412 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:55Z","lastTransitionTime":"2025-10-07T17:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.758939 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.758966 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.758975 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.758988 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.758997 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:55Z","lastTransitionTime":"2025-10-07T17:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.861661 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.861696 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.861706 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.861720 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.861730 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:55Z","lastTransitionTime":"2025-10-07T17:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.964252 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.964290 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.964300 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.964314 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:55 crc kubenswrapper[4681]: I1007 17:04:55.964323 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:55Z","lastTransitionTime":"2025-10-07T17:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.066465 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.066512 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.066527 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.066543 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.066557 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:56Z","lastTransitionTime":"2025-10-07T17:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.168397 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.168437 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.168451 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.168466 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.168475 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:56Z","lastTransitionTime":"2025-10-07T17:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.270528 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.270554 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.270564 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.270579 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.270590 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:56Z","lastTransitionTime":"2025-10-07T17:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.373119 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.373145 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.373152 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.373167 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.373174 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:56Z","lastTransitionTime":"2025-10-07T17:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.475257 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.475290 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.475297 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.475335 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.475345 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:56Z","lastTransitionTime":"2025-10-07T17:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.578850 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.578984 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.579014 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.579049 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.579090 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:56Z","lastTransitionTime":"2025-10-07T17:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.682705 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.682765 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.682784 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.682808 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.682828 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:56Z","lastTransitionTime":"2025-10-07T17:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.785273 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.785309 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.785320 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.785339 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.785350 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:56Z","lastTransitionTime":"2025-10-07T17:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.887632 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.887714 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.887727 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.887743 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.888091 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:56Z","lastTransitionTime":"2025-10-07T17:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.990378 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.990430 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.990447 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.990472 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:56 crc kubenswrapper[4681]: I1007 17:04:56.990488 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:56Z","lastTransitionTime":"2025-10-07T17:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.029102 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.029178 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:57 crc kubenswrapper[4681]: E1007 17:04:57.029345 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.029363 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.029395 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:57 crc kubenswrapper[4681]: E1007 17:04:57.030559 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:04:57 crc kubenswrapper[4681]: E1007 17:04:57.029720 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:04:57 crc kubenswrapper[4681]: E1007 17:04:57.029460 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.047934 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda12d18a442f88c1fb6f9ccab11ac71496bf3b18b504115359679f3fa7acf1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.063004 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24d811122eeaa1935d7121a38381d5795e11fb8d11a016bc96d465822131d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eba7c9a1b6e639bf95f4269903fcb39db922f7788d5ece99a410c0601b6ae4f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.079321 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.093354 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.093388 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.093397 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.093410 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.093419 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:57Z","lastTransitionTime":"2025-10-07T17:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.096939 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xjf9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35b1b84e-518a-4567-8ad9-0e717e9958fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvp2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xjf9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.111988 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72878526-99b4-467a-a3d5-08583a1d59fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1da7d62c924b38e3a83abdbc714d8be67de22135aee46362c5c14657bda99957\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://639c75a2eb855a61dcbc5d73b5b47e4571c041eb5d1ee18c9ebcdd542dd9982b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c916c6ac3deb07ef80aa6c927e9121c94ef06f15c68e638964451b9a22dbb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc02683342358ea2ba7670184089f4e914220c07af6e8f81f23be6994d2c0bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.125726 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bt6z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"78a1d2b3-3c0e-49f1-877c-db4f34d3154b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc452c09c8f7b7c7c78ba1ca48d06b861e7f647975cf88452a4426686d360817\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:04:25Z\\\",\\\"message\\\":\\\"2025-10-07T17:03:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b0bc378-da86-4f9a-a1e1-f918eee18f22\\\\n2025-10-07T17:03:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b0bc378-da86-4f9a-a1e1-f918eee18f22 to /host/opt/cni/bin/\\\\n2025-10-07T17:03:40Z [verbose] multus-daemon started\\\\n2025-10-07T17:03:40Z [verbose] Readiness 
Indicator file check\\\\n2025-10-07T17:04:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dwvdw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bt6z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.145318 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcb2afcd-00d7-404d-9142-15c9fa365d2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f774e29adda222a4662ace0017b8b7697481452841491133bf4839360ba9a8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d154bff69a7759ebb8bddb38418860feb34ce4d40bda71a2a88e4666aec034a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6be6de67aaf566b1da50e2c3db92f3a3fc268f6923f6230c3b908f7508b580ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee15977ad63f3e4a04a7bc4596dd6bfefb39f2171f43aaef7b541607e5bf5cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3faf0e5c9bce9ab9d9ed8a287e3c99379718482eed0a2f6454fc568595674de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92b8988db9e1f9b388c5c1343a47566d06dbb3788e40434308038a0dad73567d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://927b9b6795fc90eb9a4934aaa64a315eb9438d8a18f34455eadf48bbf5e58536\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4rn7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.157272 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gm45r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d3235e5-a1c4-43c7-ab08-91ac8017289c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58dd10a49fca8f3230c5280434a0ebf8bac4ce86d07577172067e3f1fa0affe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddm75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gm45r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.168790 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nvfz9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1337a96-93a5-4711-bf76-6e722a4cfd6f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e21357f7e518e641a0373ea02a4439fd38221934ab57b6db888022c41fd7ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jbz4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nvfz9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.179010 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.189159 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.195650 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.195708 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.195720 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.195734 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.195763 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:57Z","lastTransitionTime":"2025-10-07T17:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.203692 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"615b8d72-0ec5-42d0-966e-db1c2b787962\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T17:04:34Z\\\",\\\"message\\\":\\\"rc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:34Z is after 2025-08-24T17:21:41Z]\\\\nI1007 17:04:34.901057 6612 lb_config.go:1031] Cluster endpoints for openshift-machine-api/machine-api-operator-webhook for network=default are: map[]\\\\nI1007 17:04:34.901078 6612 services_controller.go:443] Built service openshift-machine-api/machine-api-operator-webhook LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.254\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1007 17:04:34.901084 6612 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-daemon\\\\\\\"}\\\\nI1007 17:04:34.901064 6612 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/control-p\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:04:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 
40s restarting failed container=ovnkube-controller pod=ovnkube-node-d6lkl_openshift-ovn-kubernetes(615b8d72-0ec5-42d0-966e-db1c2b787962)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55
c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwz2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d6lkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.219164 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21eb27d-a100-480e-a47c-62096b9aeab9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8868fabe650a5c809fbec2f2387f2cebb25dc371b0796eb24be99cee5c70eaf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://98d57dbe0d8916ac55467c7fb6ac009114c94a8915fddcfe57e8c46bfa005d54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71f50c2e4c3d1926e4b7e225b8a729308e177f42850bf648c986b86e740b326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a497be6ad9493d86bd47a7302e70eaa29c07906391a074fb587dc67744f7097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b62ca99f24411f001a70541e5409ac7f2bd7bdd17bcd16530deb6c6180017e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860
a46e65d962b0ec2653c02c25387f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a590a3689d408d3a3a53e97af557b413b860a46e65d962b0ec2653c02c25387f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e338443e2cee426f7d92c559cb1b90b1484a0a4c88b947c8fbf8ffd42efb6ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f60d5a15f5cc808ca8ae750521952290a70c797fabdbd941a9b31eb274a71339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.227251 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19107d9c-8793-4766-9661-014743799a9f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f498c4ac9c2e2aa10188ede03e77421d502ec718a71af923d84081943350a914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0751011a6e111dd3b3d09222e826afe9f712b02143a54130dfa00361cf6d3d98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0751011a6e111dd3b3d09222e826afe9f712b02143a54130dfa00361cf6d3d98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.238527 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a135a4-7097-4685-934d-d51887be5b99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f08c8e23470ae7bfc6f25b411f84e7dbed781e00a26e23b7c65a87c544df30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://843b59060d01d0a34e08e0c0c624868108517a8fec1fb0c96ebf7d300e4dec8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8a850fca1bffe222abd7161f3f9b423e3e2610a678ce5ac5e8b8124c4f0dde6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8e19e2693808909e599802074123b31395c3cb8992f734a1d7191848532e953\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210f5d7d33089968b3d2fad7f84d253afce0009fe9163358e9065f8a3f123492\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T17:03:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW1007 17:03:32.807038 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1007 17:03:32.807175 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 17:03:32.807930 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2405939666/tls.crt::/tmp/serving-cert-2405939666/tls.key\\\\\\\"\\\\nI1007 17:03:33.409780 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1007 17:03:33.413394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1007 17:03:33.413416 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1007 17:03:33.413440 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1007 17:03:33.413446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1007 17:03:33.419149 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1007 17:03:33.419204 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1007 17:03:33.419216 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1007 17:03:33.419218 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1007 17:03:33.419221 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1007 17:03:33.419224 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1007 17:03:33.419359 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1007 17:03:33.420829 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db5c62f5b0d2274c3866d71a82f46b32e867ef420a574e989e9c871e918f9a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f43dde9c13b92ee3464ae7ac41147114c5b3c027ad84833a217226976dd1544\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.247310 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e25255baf27a2dafdc2f2046311a95de229d1e0f8de66357c4e801c1dd7a1308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.256618 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0888bed1-620e-4a75-bcf8-460b4cd280ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95f0be235c2206b52a712ae05fb2c001fbdaf61484e535331d2ceb386182dc8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qph5x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8z5w6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.266149 4681 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a2c488b-e563-4bc2-aaec-064d33709f54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93918b739fba58a9b4e23d3131ae90e34548a07be5a59ceb18accaa6516572e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9f7b57b64232d8263f8b4c64781ddba6d33986e7d379a64caa7b5abc9dae9e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ns8th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-52862\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.275433 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f5d9542-6447-4a77-829b-064c809cf81d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T17:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4eadece9eaef40838cea0c158dfd6208bf8392a02daab5c6e440143e2c9f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://942434d645ee2a2ed25d4535eec28588e1988b53927e54c50c1c15d293fe6a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e99b069a866faa32130576f133ab0a61334f2e7f164cb87f204f032cc3c05391\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T17:03:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f98542ab0c413832a25dc4529fa4fd723dee659e13fc3bff39b944185d969bcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f98542ab0c413832a25dc4529fa4fd723dee659e13fc3bff39b944185d969bcb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T17:03:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T17:03:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T17:03:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:04:57Z is after 2025-08-24T17:21:41Z" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.298060 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.298094 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.298105 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.298118 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.298127 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:57Z","lastTransitionTime":"2025-10-07T17:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.400364 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.400399 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.400407 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.400422 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.400432 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:57Z","lastTransitionTime":"2025-10-07T17:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.502939 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.502982 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.502994 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.503010 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.503019 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:57Z","lastTransitionTime":"2025-10-07T17:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.605623 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.605698 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.605723 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.605754 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.605776 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:57Z","lastTransitionTime":"2025-10-07T17:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.708469 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.708502 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.708510 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.708522 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.708530 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:57Z","lastTransitionTime":"2025-10-07T17:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.810829 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.810900 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.810909 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.810923 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.810931 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:57Z","lastTransitionTime":"2025-10-07T17:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.913338 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.913378 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.913387 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.913404 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:57 crc kubenswrapper[4681]: I1007 17:04:57.913414 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:57Z","lastTransitionTime":"2025-10-07T17:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.015898 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.015934 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.015942 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.015958 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.015970 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:58Z","lastTransitionTime":"2025-10-07T17:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.119080 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.119118 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.119129 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.119144 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.119156 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:58Z","lastTransitionTime":"2025-10-07T17:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.222467 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.222525 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.222539 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.222559 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.222576 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:58Z","lastTransitionTime":"2025-10-07T17:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.325393 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.325431 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.325441 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.325455 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.325466 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:58Z","lastTransitionTime":"2025-10-07T17:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.428159 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.428196 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.428207 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.428223 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.428235 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:58Z","lastTransitionTime":"2025-10-07T17:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.531195 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.531581 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.531748 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.531910 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.532081 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:58Z","lastTransitionTime":"2025-10-07T17:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.635169 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.635205 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.635213 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.635227 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.635236 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:58Z","lastTransitionTime":"2025-10-07T17:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.738484 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.738556 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.738575 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.738600 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.738616 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:58Z","lastTransitionTime":"2025-10-07T17:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.840966 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.841018 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.841029 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.841047 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.841061 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:58Z","lastTransitionTime":"2025-10-07T17:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.943971 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.944020 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.944032 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.944050 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:58 crc kubenswrapper[4681]: I1007 17:04:58.944068 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:58Z","lastTransitionTime":"2025-10-07T17:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.028586 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.028614 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.028725 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:04:59 crc kubenswrapper[4681]: E1007 17:04:59.028864 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.028957 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:04:59 crc kubenswrapper[4681]: E1007 17:04:59.029086 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:04:59 crc kubenswrapper[4681]: E1007 17:04:59.029164 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:04:59 crc kubenswrapper[4681]: E1007 17:04:59.029270 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.046736 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.046774 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.046785 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.046830 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.046843 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:59Z","lastTransitionTime":"2025-10-07T17:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.150416 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.150474 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.150490 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.150516 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.150534 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:59Z","lastTransitionTime":"2025-10-07T17:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.253291 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.253361 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.253383 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.253413 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.253436 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:59Z","lastTransitionTime":"2025-10-07T17:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.357075 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.357136 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.357155 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.357182 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.357199 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:59Z","lastTransitionTime":"2025-10-07T17:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.459756 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.459814 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.459831 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.459855 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.459872 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:59Z","lastTransitionTime":"2025-10-07T17:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.563323 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.563403 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.563428 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.563460 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.563485 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:59Z","lastTransitionTime":"2025-10-07T17:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.665749 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.665786 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.665794 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.665808 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.665817 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:59Z","lastTransitionTime":"2025-10-07T17:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.769461 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.769495 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.769505 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.769552 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.769563 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:59Z","lastTransitionTime":"2025-10-07T17:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.872444 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.872514 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.872531 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.872553 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.872569 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:59Z","lastTransitionTime":"2025-10-07T17:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.974755 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.974806 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.974817 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.974835 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:04:59 crc kubenswrapper[4681]: I1007 17:04:59.974847 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:04:59Z","lastTransitionTime":"2025-10-07T17:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.077918 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.077993 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.078002 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.078016 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.078025 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:00Z","lastTransitionTime":"2025-10-07T17:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.181327 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.181372 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.181386 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.181403 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.181415 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:00Z","lastTransitionTime":"2025-10-07T17:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.287915 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.287949 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.287960 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.287974 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.287985 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:00Z","lastTransitionTime":"2025-10-07T17:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.389987 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.390027 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.390079 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.390097 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.390108 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:00Z","lastTransitionTime":"2025-10-07T17:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.492933 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.492978 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.492990 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.493006 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.493018 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:00Z","lastTransitionTime":"2025-10-07T17:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.595391 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.595417 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.595424 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.595436 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.595445 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:00Z","lastTransitionTime":"2025-10-07T17:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.698183 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.698227 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.698240 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.698255 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.698266 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:00Z","lastTransitionTime":"2025-10-07T17:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.801487 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.801528 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.801538 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.801583 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.801594 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:00Z","lastTransitionTime":"2025-10-07T17:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.904603 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.904707 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.904727 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.904752 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:00 crc kubenswrapper[4681]: I1007 17:05:00.904773 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:00Z","lastTransitionTime":"2025-10-07T17:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.007715 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.007750 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.007761 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.007776 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.007786 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:01Z","lastTransitionTime":"2025-10-07T17:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.029282 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.029332 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.029362 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:05:01 crc kubenswrapper[4681]: E1007 17:05:01.029430 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.029283 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:05:01 crc kubenswrapper[4681]: E1007 17:05:01.029577 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:05:01 crc kubenswrapper[4681]: E1007 17:05:01.030224 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:05:01 crc kubenswrapper[4681]: E1007 17:05:01.030450 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.110812 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.110914 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.110942 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.110976 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.111001 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:01Z","lastTransitionTime":"2025-10-07T17:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.213929 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.213995 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.214013 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.214044 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.214064 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:01Z","lastTransitionTime":"2025-10-07T17:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.318254 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.318324 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.318339 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.318362 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.318378 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:01Z","lastTransitionTime":"2025-10-07T17:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.422380 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.422418 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.422428 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.422446 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.422455 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:01Z","lastTransitionTime":"2025-10-07T17:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.529059 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.529098 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.529109 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.529124 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.529135 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:01Z","lastTransitionTime":"2025-10-07T17:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.632523 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.632579 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.632596 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.632624 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.632643 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:01Z","lastTransitionTime":"2025-10-07T17:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.735784 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.735855 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.735877 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.735942 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.735965 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:01Z","lastTransitionTime":"2025-10-07T17:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.838057 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.838084 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.838092 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.838104 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.838112 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:01Z","lastTransitionTime":"2025-10-07T17:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.940603 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.940638 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.940646 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.940660 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:01 crc kubenswrapper[4681]: I1007 17:05:01.940669 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:01Z","lastTransitionTime":"2025-10-07T17:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.044035 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.044111 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.044141 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.044170 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.044191 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:02Z","lastTransitionTime":"2025-10-07T17:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.146078 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.146134 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.146147 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.146162 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.146174 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:02Z","lastTransitionTime":"2025-10-07T17:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.249065 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.249096 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.249105 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.249118 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.249128 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:02Z","lastTransitionTime":"2025-10-07T17:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.351769 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.351819 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.351839 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.351870 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.351928 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:02Z","lastTransitionTime":"2025-10-07T17:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.455561 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.455605 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.455621 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.455643 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.455659 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:02Z","lastTransitionTime":"2025-10-07T17:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.558432 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.558511 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.558548 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.558580 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.558602 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:02Z","lastTransitionTime":"2025-10-07T17:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.662017 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.662071 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.662088 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.662111 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.662127 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:02Z","lastTransitionTime":"2025-10-07T17:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.764048 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.764079 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.764087 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.764099 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.764109 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:02Z","lastTransitionTime":"2025-10-07T17:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.866795 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.866825 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.866833 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.866845 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.866854 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:02Z","lastTransitionTime":"2025-10-07T17:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.970238 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.970282 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.970291 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.970305 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:02 crc kubenswrapper[4681]: I1007 17:05:02.970315 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:02Z","lastTransitionTime":"2025-10-07T17:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.029142 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.029213 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:05:03 crc kubenswrapper[4681]: E1007 17:05:03.029246 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.029410 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:05:03 crc kubenswrapper[4681]: E1007 17:05:03.029513 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.029695 4681 scope.go:117] "RemoveContainer" containerID="c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.029749 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:05:03 crc kubenswrapper[4681]: E1007 17:05:03.029822 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-d6lkl_openshift-ovn-kubernetes(615b8d72-0ec5-42d0-966e-db1c2b787962)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" Oct 07 17:05:03 crc kubenswrapper[4681]: E1007 17:05:03.029823 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:05:03 crc kubenswrapper[4681]: E1007 17:05:03.029868 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.072678 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.072710 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.072722 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.072734 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.072743 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:03Z","lastTransitionTime":"2025-10-07T17:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.176236 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.176316 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.176329 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.176347 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.176359 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:03Z","lastTransitionTime":"2025-10-07T17:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.278767 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.278814 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.278824 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.278845 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.278856 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:03Z","lastTransitionTime":"2025-10-07T17:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.380984 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.381045 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.381062 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.381089 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.381109 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:03Z","lastTransitionTime":"2025-10-07T17:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.485530 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.485563 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.485572 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.485587 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.485597 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:03Z","lastTransitionTime":"2025-10-07T17:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.587385 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.587426 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.587436 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.587451 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.587460 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:03Z","lastTransitionTime":"2025-10-07T17:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.689918 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.689956 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.689968 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.689986 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.689999 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:03Z","lastTransitionTime":"2025-10-07T17:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.793193 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.793232 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.793242 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.793257 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.793266 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:03Z","lastTransitionTime":"2025-10-07T17:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.895998 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.896074 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.896097 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.896126 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.896145 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:03Z","lastTransitionTime":"2025-10-07T17:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.998719 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.998770 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.998782 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.998800 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:03 crc kubenswrapper[4681]: I1007 17:05:03.998813 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:03Z","lastTransitionTime":"2025-10-07T17:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.100870 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.100914 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.100923 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.100936 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.100945 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:04Z","lastTransitionTime":"2025-10-07T17:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.203578 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.203633 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.203647 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.203667 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.203681 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:04Z","lastTransitionTime":"2025-10-07T17:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.305486 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.305525 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.305533 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.305547 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.305556 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:04Z","lastTransitionTime":"2025-10-07T17:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.407229 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.407270 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.407281 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.407304 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.407321 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:04Z","lastTransitionTime":"2025-10-07T17:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.509020 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.509059 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.509068 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.509083 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.509092 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:04Z","lastTransitionTime":"2025-10-07T17:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.612096 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.612469 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.612647 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.612930 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.613122 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:04Z","lastTransitionTime":"2025-10-07T17:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.716393 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.716449 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.716464 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.716488 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.716503 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:04Z","lastTransitionTime":"2025-10-07T17:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.818707 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.818756 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.818770 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.818789 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.818801 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:04Z","lastTransitionTime":"2025-10-07T17:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.921486 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.921532 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.921543 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.921557 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:04 crc kubenswrapper[4681]: I1007 17:05:04.921566 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:04Z","lastTransitionTime":"2025-10-07T17:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.023785 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.023847 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.023871 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.023928 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.023946 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:05Z","lastTransitionTime":"2025-10-07T17:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.028184 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.028231 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.028251 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:05:05 crc kubenswrapper[4681]: E1007 17:05:05.028347 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.028382 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:05:05 crc kubenswrapper[4681]: E1007 17:05:05.028540 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:05:05 crc kubenswrapper[4681]: E1007 17:05:05.028581 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:05:05 crc kubenswrapper[4681]: E1007 17:05:05.028655 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.127003 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.127067 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.127085 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.127112 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.127131 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:05Z","lastTransitionTime":"2025-10-07T17:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.229986 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.230064 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.230083 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.230135 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.230153 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:05Z","lastTransitionTime":"2025-10-07T17:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.265253 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.265308 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.265327 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.265350 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.265367 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:05Z","lastTransitionTime":"2025-10-07T17:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:05 crc kubenswrapper[4681]: E1007 17:05:05.280009 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:05:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:05:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:05:05Z is after 2025-08-24T17:21:41Z" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.284018 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.284054 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.284066 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.284081 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.284093 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:05Z","lastTransitionTime":"2025-10-07T17:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:05 crc kubenswrapper[4681]: E1007 17:05:05.297532 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:05:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:05:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T17:05:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T17:05:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b2328d2-1a46-4216-a128-b08f63d29d00\\\",\\\"systemUUID\\\":\\\"7362d865-50da-43c9-b446-61154f28e86f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:05:05Z is after 2025-08-24T17:21:41Z" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.303315 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.303357 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure"
Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.303373 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.303403 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.303419 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:05Z","lastTransitionTime":"2025-10-07T17:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:05:05 crc kubenswrapper[4681]: E1007 17:05:05.324083 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch byte-identical to the 17:05:05.297532 attempt above] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:05:05Z is after 2025-08-24T17:21:41Z"
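Every status-patch retry above fails identically: the kubelet reaches the node.network-node-identity webhook on 127.0.0.1:9743, but the serving certificate expired on 2025-08-24T17:21:41Z, roughly 44 days before the node clock (2025-10-07T17:05:05Z). A minimal sketch of confirming what the kubelet sees, assuming Python plus the third-party cryptography package are available on the node (the script name and output format are illustrative, not taken from the log):

# check_webhook_cert.py - diagnostic sketch, assumptions as noted above.
import ssl
from datetime import datetime, timezone

from cryptography import x509  # third-party package, assumed installed

HOST, PORT = "127.0.0.1", 9743  # from the Post URL in the webhook error

# Fetch the presented certificate WITHOUT verifying it; verification is
# exactly what fails here, and we only want to inspect the dates.
pem = ssl.get_server_certificate((HOST, PORT))
cert = x509.load_pem_x509_certificate(pem.encode())

now = datetime.now(timezone.utc)
not_after = cert.not_valid_after_utc  # UTC-aware accessor, cryptography >= 42
print("subject: ", cert.subject.rfc4514_string())
print("notAfter:", not_after.isoformat())
if now > not_after:
    print("expired for:", now - not_after)  # the log implies ~44 days

Against this node it should report notAfter 2025-08-24T17:21:41+00:00. On CRC the usual remedy is to let the cluster renew its internal certificates on the next start rather than replacing this one serving certificate by hand.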
Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.329728 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.329766 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.329776 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.329800 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.329809 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:05Z","lastTransitionTime":"2025-10-07T17:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 07 17:05:05 crc kubenswrapper[4681]: E1007 17:05:05.343492 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch byte-identical to the 17:05:05.297532 attempt above] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:05:05Z is after 2025-08-24T17:21:41Z"
Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.347190 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.347220 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.347231 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.347246 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.347256 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:05Z","lastTransitionTime":"2025-10-07T17:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
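The patch payloads above look garbled only because they are quoted twice: the kubelet logs the error as a quoted err string, and the JSON patch inside it is quoted once more, so every quote in the JSON surfaces as \\\" in the journal. A sketch of peeling both layers back, assuming Python and using an abridged payload (the real one is the multi-kilobyte blob above):

# unquote_patch.py - sketch; 'line' is abridged from the entries above.
import json
import re

line = r'''E1007 17:05:05.324083 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\"}}}\" for node \"crc\""'''

# Layer 1: undo the escaping of the err="..." value (json.loads handles
# the \" and \\ sequences used here).
err = json.loads('"' + re.search(r'err="(.*)"$', line).group(1) + '"')

# Layer 2: the patch itself is a quoted string inside err; unquote it,
# then parse the resulting JSON.
quoted = re.search(r'failed to patch status (".*") for node', err).group(1)
patch = json.loads(json.loads(quoted))
print(json.dumps(patch, indent=2))

Run against the abridged line this prints the {"status": {"capacity": {"cpu": "8"}}} patch; run against a full journal entry it recovers the complete status patch, image list and nodeInfo included.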
Oct 07 17:05:05 crc kubenswrapper[4681]: E1007 17:05:05.360478 4681 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch byte-identical to the 17:05:05.297532 attempt above] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T17:05:05Z is after 2025-08-24T17:21:41Z"
Oct 07 17:05:05 crc kubenswrapper[4681]: E1007 17:05:05.360597 4681 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.362743 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.362782 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.362794 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.362809 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:05:05 crc kubenswrapper[4681]: I1007 17:05:05.362820 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:05Z","lastTransitionTime":"2025-10-07T17:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[the same five-entry NodeHasSufficientMemory / NodeHasNoDiskPressure / NodeHasSufficientPID / NodeNotReady / "Node became not ready" cycle repeats with only the timestamps advancing, at 17:05:05.465497, 17:05:05.569751, 17:05:05.673229, 17:05:05.776196, 17:05:05.879183, 17:05:05.981975, 17:05:06.084729, 17:05:06.187285, 17:05:06.290054 and 17:05:06.392762; a final cycle starting at 17:05:06.494977 is cut off mid-message at the end of this excerpt]
Has your network provider started?"} Oct 07 17:05:06 crc kubenswrapper[4681]: I1007 17:05:06.597767 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:06 crc kubenswrapper[4681]: I1007 17:05:06.597813 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:06 crc kubenswrapper[4681]: I1007 17:05:06.597829 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:06 crc kubenswrapper[4681]: I1007 17:05:06.597853 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:06 crc kubenswrapper[4681]: I1007 17:05:06.597870 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:06Z","lastTransitionTime":"2025-10-07T17:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:06 crc kubenswrapper[4681]: I1007 17:05:06.700116 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:06 crc kubenswrapper[4681]: I1007 17:05:06.700144 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:06 crc kubenswrapper[4681]: I1007 17:05:06.700151 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:06 crc kubenswrapper[4681]: I1007 17:05:06.700163 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:06 crc kubenswrapper[4681]: I1007 17:05:06.700171 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:06Z","lastTransitionTime":"2025-10-07T17:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:06 crc kubenswrapper[4681]: I1007 17:05:06.802855 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:06 crc kubenswrapper[4681]: I1007 17:05:06.802920 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:06 crc kubenswrapper[4681]: I1007 17:05:06.802931 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:06 crc kubenswrapper[4681]: I1007 17:05:06.802945 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:06 crc kubenswrapper[4681]: I1007 17:05:06.802954 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:06Z","lastTransitionTime":"2025-10-07T17:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
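Editorial note: the setters.go:603 entries above embed the node's Ready condition as JSON after condition=. A minimal Python sketch for pulling the transition reason and message out of such lines; the script is hypothetical and assumes journalctl-style input on stdin with one journal entry per line (as journalctl emits, unlike the wrapped excerpt above):

#!/usr/bin/env python3
# Sketch: extract the Ready-condition JSON from "Node became not ready"
# kubelet entries. Assumed usage: journalctl -u kubelet | python3 parse_ready.py
import json
import re
import sys

# The kubelet logs the condition as condition={...} at the end of the entry.
COND_RE = re.compile(r'"Node became not ready" node="(?P<node>[^"]+)" condition=(?P<json>\{.*\})')

for line in sys.stdin:
    m = COND_RE.search(line)
    if not m:
        continue
    cond = json.loads(m.group("json"))
    print(f'{cond["lastHeartbeatTime"]} node={m.group("node")} '
          f'reason={cond["reason"]} message={cond["message"]!r}')

Against this boot it would print the same KubeletNotReady reason on every heartbeat: the CNI configuration directory /etc/kubernetes/cni/net.d/ is still empty, so the container runtime reports NetworkReady=false and the node stays NotReady.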
Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.028709 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z"
Oct 07 17:05:07 crc kubenswrapper[4681]: E1007 17:05:07.028807 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb"
Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.029023 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 17:05:07 crc kubenswrapper[4681]: E1007 17:05:07.029087 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.029122 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.029191 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 17:05:07 crc kubenswrapper[4681]: E1007 17:05:07.029379 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 17:05:07 crc kubenswrapper[4681]: E1007 17:05:07.029216 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
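Editorial note: four pods are stuck in the same sandbox-creation loop. A small Python sketch (hypothetical helper; assumes the kubelet journal on stdin, one entry per line) that tallies "Error syncing pod, skipping" entries per pod and UID, making it easy to see which workloads are blocked on the missing CNI config:

#!/usr/bin/env python3
# Sketch: count "Error syncing pod, skipping" entries per pod/podUID.
# Assumed usage: journalctl -u kubelet | python3 count_pod_errors.py
import re
import sys
from collections import Counter

ERR_RE = re.compile(r'"Error syncing pod, skipping".*pod="(?P<pod>[^"]+)" podUID="(?P<uid>[^"]+)"')

counts = Counter()
for line in sys.stdin:
    m = ERR_RE.search(line)
    if m:
        counts[(m.group("pod"), m.group("uid"))] += 1

for (pod, uid), n in counts.most_common():
    print(f"{n:4d}  {pod}  uid={uid}")

In this excerpt each of the four pods (network-metrics-daemon-xjf9z, network-check-source, network-check-target, networking-console-plugin) is reported once in the 17:05:07.028-.029 round and again at 17:05:09.028-.029.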
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.090221 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=86.090197522 podStartE2EDuration="1m26.090197522s" podCreationTimestamp="2025-10-07 17:03:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:07.074504147 +0000 UTC m=+110.721915782" watchObservedRunningTime="2025-10-07 17:05:07.090197522 +0000 UTC m=+110.737609107" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.113496 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.113533 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.113544 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.113560 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.113571 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:07Z","lastTransitionTime":"2025-10-07T17:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.142594 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=59.142568346 podStartE2EDuration="59.142568346s" podCreationTimestamp="2025-10-07 17:04:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:07.142200205 +0000 UTC m=+110.789611790" watchObservedRunningTime="2025-10-07 17:05:07.142568346 +0000 UTC m=+110.789979941" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.176097 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=89.17607272 podStartE2EDuration="1m29.17607272s" podCreationTimestamp="2025-10-07 17:03:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:07.172555417 +0000 UTC m=+110.819966992" watchObservedRunningTime="2025-10-07 17:05:07.17607272 +0000 UTC m=+110.823484275" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.176250 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=31.176245315 podStartE2EDuration="31.176245315s" podCreationTimestamp="2025-10-07 17:04:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:07.158470055 +0000 UTC m=+110.805881620" watchObservedRunningTime="2025-10-07 17:05:07.176245315 +0000 UTC m=+110.823656870" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.216425 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podStartSLOduration=91.216404826 podStartE2EDuration="1m31.216404826s" podCreationTimestamp="2025-10-07 17:03:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:07.201714928 +0000 UTC m=+110.849126503" watchObservedRunningTime="2025-10-07 17:05:07.216404826 +0000 UTC m=+110.863816391" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.216791 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.216929 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.216954 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.217027 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.217058 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:07Z","lastTransitionTime":"2025-10-07T17:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.234279 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-52862" podStartSLOduration=90.234256507 podStartE2EDuration="1m30.234256507s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:07.218085909 +0000 UTC m=+110.865497484" watchObservedRunningTime="2025-10-07 17:05:07.234256507 +0000 UTC m=+110.881668072" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.248533 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=90.248503493 podStartE2EDuration="1m30.248503493s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:07.235209043 +0000 UTC m=+110.882620618" watchObservedRunningTime="2025-10-07 17:05:07.248503493 +0000 UTC m=+110.895915078" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.320065 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.320103 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.320114 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.320128 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.320140 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:07Z","lastTransitionTime":"2025-10-07T17:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.335464 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bt6z6" podStartSLOduration=91.335446869 podStartE2EDuration="1m31.335446869s" podCreationTimestamp="2025-10-07 17:03:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:07.334959687 +0000 UTC m=+110.982371242" watchObservedRunningTime="2025-10-07 17:05:07.335446869 +0000 UTC m=+110.982858434" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.395469 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-gm45r" podStartSLOduration=91.395453175 podStartE2EDuration="1m31.395453175s" podCreationTimestamp="2025-10-07 17:03:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:07.395163007 +0000 UTC m=+111.042574562" watchObservedRunningTime="2025-10-07 17:05:07.395453175 +0000 UTC m=+111.042864730" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.395812 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4rn7z" podStartSLOduration=90.395808044 podStartE2EDuration="1m30.395808044s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:07.376492144 +0000 UTC m=+111.023903699" watchObservedRunningTime="2025-10-07 17:05:07.395808044 +0000 UTC m=+111.043219599" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.422419 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.422450 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.422459 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.422471 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:07 crc kubenswrapper[4681]: I1007 17:05:07.422479 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:07Z","lastTransitionTime":"2025-10-07T17:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
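Editorial note: the pod_startup_latency_tracker entries above record how long each pod took from creation to being observed running. A Python sketch (hypothetical helper; assumes the kubelet journal on stdin) that extracts podStartSLOduration and ranks pods slowest-first:

#!/usr/bin/env python3
# Sketch: rank pods by observed startup duration from
# pod_startup_latency_tracker entries.
# Assumed usage: journalctl -u kubelet | python3 rank_startup.py
import re
import sys

SLO_RE = re.compile(r'"Observed pod startup duration" pod="(?P<pod>[^"]+)" '
                    r'podStartSLOduration=(?P<secs>[0-9.]+)')

rows = []
for line in sys.stdin:
    m = SLO_RE.search(line)
    if m:
        rows.append((float(m.group("secs")), m.group("pod")))

for secs, pod in sorted(rows, reverse=True):
    print(f"{secs:10.3f}s  {pod}")

On this excerpt it would put machine-config-daemon-8z5w6 at 91.216 s at the top and kube-rbac-proxy-crio-crc at 31.176 s at the bottom; the zero-valued firstStartedPulling/lastFinishedPulling timestamps suggest no image pulls contributed to these durations.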
[... the five-entry node-status pattern repeats at roughly 100 ms intervals from 17:05:07.422419 through 17:05:08.963389, with only the timestamps advancing ...]
Oct 07 17:05:09 crc kubenswrapper[4681]: I1007 17:05:09.028776 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 17:05:09 crc kubenswrapper[4681]: I1007 17:05:09.028837 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 17:05:09 crc kubenswrapper[4681]: I1007 17:05:09.028862 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 17:05:09 crc kubenswrapper[4681]: E1007 17:05:09.028989 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 07 17:05:09 crc kubenswrapper[4681]: I1007 17:05:09.029020 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z"
Oct 07 17:05:09 crc kubenswrapper[4681]: E1007 17:05:09.029087 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 07 17:05:09 crc kubenswrapper[4681]: E1007 17:05:09.029180 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 07 17:05:09 crc kubenswrapper[4681]: E1007 17:05:09.029338 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb"
[... the five-entry node-status pattern recurs at 17:05:09.067139-.170114 ...]
[... the five-entry node-status pattern repeats at roughly 100 ms intervals from 17:05:09.273158 through 17:05:10.605436, with only the timestamps advancing ...]
Oct 07 17:05:10 crc kubenswrapper[4681]: I1007 17:05:10.707180 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 07 17:05:10 crc kubenswrapper[4681]: I1007 17:05:10.707217 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 07 17:05:10 crc kubenswrapper[4681]: I1007 17:05:10.707231 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 07 17:05:10 crc kubenswrapper[4681]: I1007 17:05:10.707249 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 07 17:05:10 crc kubenswrapper[4681]: I1007 17:05:10.707262 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:10Z","lastTransitionTime":"2025-10-07T17:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:10 crc kubenswrapper[4681]: I1007 17:05:10.810016 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:10 crc kubenswrapper[4681]: I1007 17:05:10.810051 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:10 crc kubenswrapper[4681]: I1007 17:05:10.810061 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:10 crc kubenswrapper[4681]: I1007 17:05:10.810074 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:10 crc kubenswrapper[4681]: I1007 17:05:10.810084 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:10Z","lastTransitionTime":"2025-10-07T17:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:10 crc kubenswrapper[4681]: I1007 17:05:10.916105 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:10 crc kubenswrapper[4681]: I1007 17:05:10.916142 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:10 crc kubenswrapper[4681]: I1007 17:05:10.916152 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:10 crc kubenswrapper[4681]: I1007 17:05:10.916165 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:10 crc kubenswrapper[4681]: I1007 17:05:10.916174 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:10Z","lastTransitionTime":"2025-10-07T17:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.018313 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.018349 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.018375 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.018388 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.018398 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:11Z","lastTransitionTime":"2025-10-07T17:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.028745 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.028768 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.028867 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:05:11 crc kubenswrapper[4681]: E1007 17:05:11.029052 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.029304 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:05:11 crc kubenswrapper[4681]: E1007 17:05:11.029428 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:05:11 crc kubenswrapper[4681]: E1007 17:05:11.029654 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:05:11 crc kubenswrapper[4681]: E1007 17:05:11.029865 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.120588 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.120641 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.120662 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.120691 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.120713 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:11Z","lastTransitionTime":"2025-10-07T17:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.224154 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.224213 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.224232 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.224257 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.224274 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:11Z","lastTransitionTime":"2025-10-07T17:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.327835 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.327955 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.327978 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.328046 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.328063 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:11Z","lastTransitionTime":"2025-10-07T17:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.432104 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.432169 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.432187 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.432212 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.432228 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:11Z","lastTransitionTime":"2025-10-07T17:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.534994 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.535092 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.535128 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.535154 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.535172 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:11Z","lastTransitionTime":"2025-10-07T17:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.578673 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bt6z6_78a1d2b3-3c0e-49f1-877c-db4f34d3154b/kube-multus/1.log" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.579308 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bt6z6_78a1d2b3-3c0e-49f1-877c-db4f34d3154b/kube-multus/0.log" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.579396 4681 generic.go:334] "Generic (PLEG): container finished" podID="78a1d2b3-3c0e-49f1-877c-db4f34d3154b" containerID="bc452c09c8f7b7c7c78ba1ca48d06b861e7f647975cf88452a4426686d360817" exitCode=1 Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.579436 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bt6z6" event={"ID":"78a1d2b3-3c0e-49f1-877c-db4f34d3154b","Type":"ContainerDied","Data":"bc452c09c8f7b7c7c78ba1ca48d06b861e7f647975cf88452a4426686d360817"} Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.579478 4681 scope.go:117] "RemoveContainer" containerID="f152e51cb8ab291122d2b5325829771621a2d5670dd475a8984d731f3d44df01" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.580450 4681 scope.go:117] "RemoveContainer" containerID="bc452c09c8f7b7c7c78ba1ca48d06b861e7f647975cf88452a4426686d360817" Oct 07 17:05:11 crc kubenswrapper[4681]: E1007 17:05:11.581058 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-bt6z6_openshift-multus(78a1d2b3-3c0e-49f1-877c-db4f34d3154b)\"" pod="openshift-multus/multus-bt6z6" podUID="78a1d2b3-3c0e-49f1-877c-db4f34d3154b" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.610824 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-nvfz9" podStartSLOduration=95.610805615 podStartE2EDuration="1m35.610805615s" podCreationTimestamp="2025-10-07 17:03:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:07.408060227 +0000 UTC m=+111.055471782" watchObservedRunningTime="2025-10-07 17:05:11.610805615 +0000 UTC m=+115.258217170" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.637628 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.637658 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.637817 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.637832 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.637841 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:11Z","lastTransitionTime":"2025-10-07T17:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.739905 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.740002 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.740016 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.740033 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.740044 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:11Z","lastTransitionTime":"2025-10-07T17:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.842754 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.842805 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.842820 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.842897 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.842949 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:11Z","lastTransitionTime":"2025-10-07T17:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.945485 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.945526 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.945540 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.945556 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:11 crc kubenswrapper[4681]: I1007 17:05:11.945567 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:11Z","lastTransitionTime":"2025-10-07T17:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.047138 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.047177 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.047188 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.047201 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.047211 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:12Z","lastTransitionTime":"2025-10-07T17:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.149645 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.149680 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.149690 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.149705 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.149716 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:12Z","lastTransitionTime":"2025-10-07T17:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.251924 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.251993 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.252009 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.252035 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.252054 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:12Z","lastTransitionTime":"2025-10-07T17:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.354485 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.354524 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.354536 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.354553 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.354582 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:12Z","lastTransitionTime":"2025-10-07T17:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.456405 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.456461 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.456479 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.456505 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.456579 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:12Z","lastTransitionTime":"2025-10-07T17:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.559580 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.559632 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.559651 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.559672 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.559693 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:12Z","lastTransitionTime":"2025-10-07T17:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.584730 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bt6z6_78a1d2b3-3c0e-49f1-877c-db4f34d3154b/kube-multus/1.log" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.662235 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.662294 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.662306 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.662323 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.662336 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:12Z","lastTransitionTime":"2025-10-07T17:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.764783 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.764831 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.764846 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.764863 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.764905 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:12Z","lastTransitionTime":"2025-10-07T17:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.867176 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.867295 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.867316 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.867347 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.867367 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:12Z","lastTransitionTime":"2025-10-07T17:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.970084 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.970143 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.970161 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.970184 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:12 crc kubenswrapper[4681]: I1007 17:05:12.970202 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:12Z","lastTransitionTime":"2025-10-07T17:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.028488 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.028518 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.028546 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:05:13 crc kubenswrapper[4681]: E1007 17:05:13.028616 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.028702 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:05:13 crc kubenswrapper[4681]: E1007 17:05:13.028748 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:05:13 crc kubenswrapper[4681]: E1007 17:05:13.028946 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:05:13 crc kubenswrapper[4681]: E1007 17:05:13.029719 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.072455 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.072509 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.072528 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.072555 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.072578 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:13Z","lastTransitionTime":"2025-10-07T17:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.175245 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.175367 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.175392 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.175421 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.175445 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:13Z","lastTransitionTime":"2025-10-07T17:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.278219 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.278269 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.278282 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.278301 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.278312 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:13Z","lastTransitionTime":"2025-10-07T17:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.381121 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.381159 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.381169 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.381182 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.381191 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:13Z","lastTransitionTime":"2025-10-07T17:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.483982 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.484038 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.484052 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.484074 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.484088 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:13Z","lastTransitionTime":"2025-10-07T17:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.586872 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.586961 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.586981 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.587001 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.587019 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:13Z","lastTransitionTime":"2025-10-07T17:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.690235 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.690292 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.690306 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.690324 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.690337 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:13Z","lastTransitionTime":"2025-10-07T17:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.792353 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.792397 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.792410 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.792427 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.792437 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:13Z","lastTransitionTime":"2025-10-07T17:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.895220 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.895274 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.895284 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.895297 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.895306 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:13Z","lastTransitionTime":"2025-10-07T17:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.998192 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.998259 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.998282 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.998314 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:13 crc kubenswrapper[4681]: I1007 17:05:13.998338 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:13Z","lastTransitionTime":"2025-10-07T17:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.030365 4681 scope.go:117] "RemoveContainer" containerID="c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5" Oct 07 17:05:14 crc kubenswrapper[4681]: E1007 17:05:14.030630 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-d6lkl_openshift-ovn-kubernetes(615b8d72-0ec5-42d0-966e-db1c2b787962)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.101986 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.102053 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.102074 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.102106 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.102130 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:14Z","lastTransitionTime":"2025-10-07T17:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.205167 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.205253 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.205278 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.205310 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.205334 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:14Z","lastTransitionTime":"2025-10-07T17:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.308127 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.308183 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.308195 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.308209 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.308218 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:14Z","lastTransitionTime":"2025-10-07T17:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.410705 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.410773 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.410787 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.410825 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.410837 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:14Z","lastTransitionTime":"2025-10-07T17:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.514250 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.514311 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.514322 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.514338 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.514347 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:14Z","lastTransitionTime":"2025-10-07T17:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.616233 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.616306 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.616341 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.616371 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.616393 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:14Z","lastTransitionTime":"2025-10-07T17:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.719591 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.719648 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.719669 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.719698 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.719722 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:14Z","lastTransitionTime":"2025-10-07T17:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.822665 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.822740 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.822764 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.822792 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.822809 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:14Z","lastTransitionTime":"2025-10-07T17:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.924984 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.925028 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.925041 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.925063 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:14 crc kubenswrapper[4681]: I1007 17:05:14.925077 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:14Z","lastTransitionTime":"2025-10-07T17:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.026926 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.026961 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.026969 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.026982 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.026992 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:15Z","lastTransitionTime":"2025-10-07T17:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.028432 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.028511 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.028438 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.028632 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:05:15 crc kubenswrapper[4681]: E1007 17:05:15.028623 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:05:15 crc kubenswrapper[4681]: E1007 17:05:15.028693 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:05:15 crc kubenswrapper[4681]: E1007 17:05:15.028757 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:05:15 crc kubenswrapper[4681]: E1007 17:05:15.028801 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.129745 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.129802 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.129819 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.129841 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.129857 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:15Z","lastTransitionTime":"2025-10-07T17:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.232171 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.232240 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.232265 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.232294 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.232316 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:15Z","lastTransitionTime":"2025-10-07T17:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.335039 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.335087 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.335100 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.335117 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.335130 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:15Z","lastTransitionTime":"2025-10-07T17:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.438107 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.438196 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.438818 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.438856 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.438869 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:15Z","lastTransitionTime":"2025-10-07T17:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.541650 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.541692 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.541701 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.541716 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.541725 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:15Z","lastTransitionTime":"2025-10-07T17:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.628642 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.628674 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.628682 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.628739 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.628750 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T17:05:15Z","lastTransitionTime":"2025-10-07T17:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.674916 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-r9wdw"] Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.675363 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r9wdw" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.677760 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.677871 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.678670 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.681480 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.772944 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-r9wdw\" (UID: \"dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r9wdw" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.773048 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa-service-ca\") pod \"cluster-version-operator-5c965bbfc6-r9wdw\" (UID: \"dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r9wdw" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.773090 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-r9wdw\" (UID: \"dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r9wdw" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.773311 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-r9wdw\" (UID: \"dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r9wdw" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.773576 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-r9wdw\" (UID: \"dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r9wdw" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.875465 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-r9wdw\" (UID: \"dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r9wdw" Oct 07 17:05:15 crc 
kubenswrapper[4681]: I1007 17:05:15.875555 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa-service-ca\") pod \"cluster-version-operator-5c965bbfc6-r9wdw\" (UID: \"dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r9wdw" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.875592 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-r9wdw\" (UID: \"dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r9wdw" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.876055 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-r9wdw\" (UID: \"dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r9wdw" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.876414 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa-service-ca\") pod \"cluster-version-operator-5c965bbfc6-r9wdw\" (UID: \"dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r9wdw" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.877935 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-r9wdw\" (UID: \"dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r9wdw" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.878041 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-r9wdw\" (UID: \"dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r9wdw" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.878125 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-r9wdw\" (UID: \"dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r9wdw" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.886520 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-r9wdw\" (UID: \"dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r9wdw" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.898741 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-r9wdw\" (UID: \"dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r9wdw" Oct 07 17:05:15 crc kubenswrapper[4681]: I1007 17:05:15.993256 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r9wdw" Oct 07 17:05:16 crc kubenswrapper[4681]: I1007 17:05:16.599101 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r9wdw" event={"ID":"dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa","Type":"ContainerStarted","Data":"50129c1b7a3425952228c883a82da7748f1c5bd7bd8fe72dc6bb3bff55e6d185"} Oct 07 17:05:16 crc kubenswrapper[4681]: I1007 17:05:16.599184 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r9wdw" event={"ID":"dc36fcf6-2043-4ffc-b0a9-7e3ef0cce1fa","Type":"ContainerStarted","Data":"15832de77f81b9101a2a28d290853b3f480e1f63d411113c63852d02d6bc57a2"} Oct 07 17:05:16 crc kubenswrapper[4681]: I1007 17:05:16.625985 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r9wdw" podStartSLOduration=100.62595565 podStartE2EDuration="1m40.62595565s" podCreationTimestamp="2025-10-07 17:03:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:16.622434368 +0000 UTC m=+120.269845953" watchObservedRunningTime="2025-10-07 17:05:16.62595565 +0000 UTC m=+120.273367245" Oct 07 17:05:16 crc kubenswrapper[4681]: E1007 17:05:16.982811 4681 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 07 17:05:17 crc kubenswrapper[4681]: I1007 17:05:17.028263 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:05:17 crc kubenswrapper[4681]: I1007 17:05:17.028365 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:05:17 crc kubenswrapper[4681]: I1007 17:05:17.028398 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:05:17 crc kubenswrapper[4681]: I1007 17:05:17.028416 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:05:17 crc kubenswrapper[4681]: E1007 17:05:17.029097 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:05:17 crc kubenswrapper[4681]: E1007 17:05:17.029286 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:05:17 crc kubenswrapper[4681]: E1007 17:05:17.029380 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:05:17 crc kubenswrapper[4681]: E1007 17:05:17.029462 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:05:17 crc kubenswrapper[4681]: E1007 17:05:17.110216 4681 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 07 17:05:19 crc kubenswrapper[4681]: I1007 17:05:19.028766 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:05:19 crc kubenswrapper[4681]: I1007 17:05:19.028795 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:05:19 crc kubenswrapper[4681]: I1007 17:05:19.028818 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:05:19 crc kubenswrapper[4681]: I1007 17:05:19.028839 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:05:19 crc kubenswrapper[4681]: E1007 17:05:19.028905 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:05:19 crc kubenswrapper[4681]: E1007 17:05:19.029045 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:05:19 crc kubenswrapper[4681]: E1007 17:05:19.029176 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:05:19 crc kubenswrapper[4681]: E1007 17:05:19.029243 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:05:21 crc kubenswrapper[4681]: I1007 17:05:21.028487 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:05:21 crc kubenswrapper[4681]: E1007 17:05:21.028610 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:05:21 crc kubenswrapper[4681]: I1007 17:05:21.028512 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:05:21 crc kubenswrapper[4681]: I1007 17:05:21.028487 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:05:21 crc kubenswrapper[4681]: E1007 17:05:21.028683 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:05:21 crc kubenswrapper[4681]: I1007 17:05:21.028708 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:05:21 crc kubenswrapper[4681]: E1007 17:05:21.028742 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:05:21 crc kubenswrapper[4681]: E1007 17:05:21.028852 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:05:22 crc kubenswrapper[4681]: E1007 17:05:22.112109 4681 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 07 17:05:23 crc kubenswrapper[4681]: I1007 17:05:23.028839 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:05:23 crc kubenswrapper[4681]: I1007 17:05:23.028898 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:05:23 crc kubenswrapper[4681]: I1007 17:05:23.028839 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:05:23 crc kubenswrapper[4681]: E1007 17:05:23.029076 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:05:23 crc kubenswrapper[4681]: E1007 17:05:23.029107 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:05:23 crc kubenswrapper[4681]: E1007 17:05:23.028973 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:05:23 crc kubenswrapper[4681]: I1007 17:05:23.028953 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:05:23 crc kubenswrapper[4681]: E1007 17:05:23.029216 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:05:25 crc kubenswrapper[4681]: I1007 17:05:25.028705 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:05:25 crc kubenswrapper[4681]: E1007 17:05:25.029089 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:05:25 crc kubenswrapper[4681]: I1007 17:05:25.028812 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:05:25 crc kubenswrapper[4681]: E1007 17:05:25.029165 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:05:25 crc kubenswrapper[4681]: I1007 17:05:25.028810 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:05:25 crc kubenswrapper[4681]: E1007 17:05:25.029220 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:05:25 crc kubenswrapper[4681]: I1007 17:05:25.028796 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:05:25 crc kubenswrapper[4681]: E1007 17:05:25.029265 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:05:26 crc kubenswrapper[4681]: I1007 17:05:26.029061 4681 scope.go:117] "RemoveContainer" containerID="bc452c09c8f7b7c7c78ba1ca48d06b861e7f647975cf88452a4426686d360817" Oct 07 17:05:26 crc kubenswrapper[4681]: I1007 17:05:26.632946 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bt6z6_78a1d2b3-3c0e-49f1-877c-db4f34d3154b/kube-multus/1.log" Oct 07 17:05:26 crc kubenswrapper[4681]: I1007 17:05:26.633393 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bt6z6" event={"ID":"78a1d2b3-3c0e-49f1-877c-db4f34d3154b","Type":"ContainerStarted","Data":"91b648fdfcd673307e0e2e274754851911d53861f02308144f0874b59804ea09"} Oct 07 17:05:27 crc kubenswrapper[4681]: I1007 17:05:27.028671 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:05:27 crc kubenswrapper[4681]: I1007 17:05:27.028710 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:05:27 crc kubenswrapper[4681]: I1007 17:05:27.028736 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:05:27 crc kubenswrapper[4681]: I1007 17:05:27.028670 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:05:27 crc kubenswrapper[4681]: E1007 17:05:27.028912 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:05:27 crc kubenswrapper[4681]: E1007 17:05:27.028991 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:05:27 crc kubenswrapper[4681]: E1007 17:05:27.029100 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:05:27 crc kubenswrapper[4681]: E1007 17:05:27.029280 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:05:27 crc kubenswrapper[4681]: E1007 17:05:27.112862 4681 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
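Every "no CNI configuration file in /etc/kubernetes/cni/net.d/" record above is the runtime failing the same simple test: does that directory contain a network config? A minimal sketch of the check, assuming libcni's usual .conf/.conflist/.json extension convention (the path is taken from the log itself); once ovnkube-node writes its config there, NetworkReady flips to true and the sandbox retries above start succeeding:

```go
// Sketch of the CNI-config presence check behind the NetworkReady=false
// records. The extension list follows the common libcni convention and is
// an assumption, not read from the CRC configuration.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func cniConfFiles(dir string) ([]string, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return nil, err
	}
	var confs []string
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			confs = append(confs, filepath.Join(dir, e.Name()))
		}
	}
	return confs, nil
}

func main() {
	confs, err := cniConfFiles("/etc/kubernetes/cni/net.d")
	if err != nil || len(confs) == 0 {
		// The state every record above reports: the runtime stays
		// NetworkReady=false and no new pod sandbox can be created.
		fmt.Println("NetworkReady=false: no CNI configuration file found")
		return
	}
	fmt.Printf("NetworkReady=true: %v\n", confs)
}
```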
Oct 07 17:05:28 crc kubenswrapper[4681]: I1007 17:05:28.029782 4681 scope.go:117] "RemoveContainer" containerID="c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5" Oct 07 17:05:28 crc kubenswrapper[4681]: I1007 17:05:28.642347 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6lkl_615b8d72-0ec5-42d0-966e-db1c2b787962/ovnkube-controller/3.log" Oct 07 17:05:28 crc kubenswrapper[4681]: I1007 17:05:28.644159 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerStarted","Data":"3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a"} Oct 07 17:05:28 crc kubenswrapper[4681]: I1007 17:05:28.645062 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:05:28 crc kubenswrapper[4681]: I1007 17:05:28.676072 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" podStartSLOduration=111.676056056 podStartE2EDuration="1m51.676056056s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:28.675307035 +0000 UTC m=+132.322718610" watchObservedRunningTime="2025-10-07 17:05:28.676056056 +0000 UTC m=+132.323467611" Oct 07 17:05:28 crc kubenswrapper[4681]: I1007 17:05:28.855974 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xjf9z"] Oct 07 17:05:28 crc kubenswrapper[4681]: I1007 17:05:28.856104 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:05:28 crc kubenswrapper[4681]: E1007 17:05:28.856254 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:05:29 crc kubenswrapper[4681]: I1007 17:05:29.028593 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:05:29 crc kubenswrapper[4681]: I1007 17:05:29.028648 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:05:29 crc kubenswrapper[4681]: E1007 17:05:29.028999 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:05:29 crc kubenswrapper[4681]: E1007 17:05:29.029123 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:05:29 crc kubenswrapper[4681]: I1007 17:05:29.029279 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:05:29 crc kubenswrapper[4681]: E1007 17:05:29.029373 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:05:30 crc kubenswrapper[4681]: I1007 17:05:30.028607 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:05:30 crc kubenswrapper[4681]: E1007 17:05:30.028759 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:05:31 crc kubenswrapper[4681]: I1007 17:05:31.029102 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:05:31 crc kubenswrapper[4681]: I1007 17:05:31.029160 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:05:31 crc kubenswrapper[4681]: I1007 17:05:31.029104 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:05:31 crc kubenswrapper[4681]: E1007 17:05:31.029241 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 17:05:31 crc kubenswrapper[4681]: E1007 17:05:31.029390 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 17:05:31 crc kubenswrapper[4681]: E1007 17:05:31.029548 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 17:05:32 crc kubenswrapper[4681]: I1007 17:05:32.028366 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:05:32 crc kubenswrapper[4681]: E1007 17:05:32.028683 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xjf9z" podUID="35b1b84e-518a-4567-8ad9-0e717e9958fb" Oct 07 17:05:33 crc kubenswrapper[4681]: I1007 17:05:33.028585 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 17:05:33 crc kubenswrapper[4681]: I1007 17:05:33.028632 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 17:05:33 crc kubenswrapper[4681]: I1007 17:05:33.028651 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:05:33 crc kubenswrapper[4681]: I1007 17:05:33.032275 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 07 17:05:33 crc kubenswrapper[4681]: I1007 17:05:33.033516 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 07 17:05:33 crc kubenswrapper[4681]: I1007 17:05:33.033551 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 07 17:05:33 crc kubenswrapper[4681]: I1007 17:05:33.033649 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 07 17:05:34 crc kubenswrapper[4681]: I1007 17:05:34.028622 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:05:34 crc kubenswrapper[4681]: I1007 17:05:34.031402 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 07 17:05:34 crc kubenswrapper[4681]: I1007 17:05:34.031974 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.260446 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.310272 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6wcpl"] Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.310630 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm"] Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.310826 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8"] Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.311272 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.312790 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6wcpl"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.313129 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.313600 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.314098 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rgk2c"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.314604 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dcg8h"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.314841 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dcg8h"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.314966 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-rgk2c"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.314978 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.315962 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hgrnc"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.316266 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgrnc"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.317039 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fst86"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.317678 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.317861 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.317954 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.318044 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.317682 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fst86"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.323215 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.323613 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tg8wr"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.324793 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-6gnnm"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.324943 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-tg8wr"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.326317 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6gnnm"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.327106 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-csgxx"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.327527 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-j8hqs"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.327601 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-csgxx"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.328013 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-j8hqs"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.328754 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.329870 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.330034 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.330316 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.330605 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.330789 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.330972 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.331027 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.331123 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.331170 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tw9ww"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.331588 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.335707 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.337037 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.337181 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.338352 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dbxkf"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.338943 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dbxkf"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.339455 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ck2fw"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.339980 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ck2fw"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.342557 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-nds8d"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.343017 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xjzz7"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.343415 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xjzz7"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.343746 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nds8d"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.359040 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.359276 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.365473 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.366184 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.366245 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.368538 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.368672 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.368760 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.368891 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.369087 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.369191 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.369834 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.369928 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.370022 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.370100 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.370158 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.370279 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.370436 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.370540 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.370780 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.370908 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.370974 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.371393 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.371981 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.372176 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.374521 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.378480 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.378760 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.378936 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.379009 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.379082 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.379137 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.379224 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.379775 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.379863 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.379968 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.380032 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.380087 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.380158 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.380265 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.380452 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.380817 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.381043 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.382006 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.383331 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4g7gf"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.383718 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vr5kp"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.384010 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.384057 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4g7gf"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.384494 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.384591 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.384671 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.384773 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.384859 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.384954 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.385079 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.385186 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.385282 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.385378 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.385493 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.385589 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.385682 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.385846 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.386106 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d712e0bd-952b-4cba-8d92-2c6e72f6b867-etcd-client\") pod \"apiserver-7bbb656c7d-6jzt8\" (UID: \"d712e0bd-952b-4cba-8d92-2c6e72f6b867\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.386125 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/468bcce5-4e0a-45dd-aeda-fe1a8fd0fea2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6wcpl\" (UID: \"468bcce5-4e0a-45dd-aeda-fe1a8fd0fea2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6wcpl"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.386141 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c-config\") pod \"etcd-operator-b45778765-xjzz7\" (UID: \"f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xjzz7"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.386156 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c-etcd-client\") pod \"etcd-operator-b45778765-xjzz7\" (UID: \"f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xjzz7"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.386171 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d712e0bd-952b-4cba-8d92-2c6e72f6b867-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6jzt8\" (UID: \"d712e0bd-952b-4cba-8d92-2c6e72f6b867\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.386184 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-serving-cert\") pod \"controller-manager-879f6c89f-fst86\" (UID: \"0d8e31b0-652c-44ff-99b0-04ae7d329f6f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fst86"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.386200 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb20182c-f315-41b3-94e2-256dac142821-config\") pod \"machine-approver-56656f9798-6gnnm\" (UID: \"cb20182c-f315-41b3-94e2-256dac142821\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6gnnm"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.386216 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvklw\" (UniqueName: \"kubernetes.io/projected/a96ffd28-b774-40e1-ad52-e6fa63483f1d-kube-api-access-kvklw\") pod \"openshift-config-operator-7777fb866f-hgrnc\" (UID: \"a96ffd28-b774-40e1-ad52-e6fa63483f1d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgrnc"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.386232 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca03f8ab-38f2-4aea-9b61-54526e3c5015-serving-cert\") pod \"authentication-operator-69f744f599-dcg8h\" (UID: \"ca03f8ab-38f2-4aea-9b61-54526e3c5015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcg8h"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.386473 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d712e0bd-952b-4cba-8d92-2c6e72f6b867-encryption-config\") pod \"apiserver-7bbb656c7d-6jzt8\" (UID: \"d712e0bd-952b-4cba-8d92-2c6e72f6b867\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.386492 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-client-ca\") pod \"controller-manager-879f6c89f-fst86\" (UID: \"0d8e31b0-652c-44ff-99b0-04ae7d329f6f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fst86"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.386603 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fdbba2d-2b9f-47f3-a618-f1284f5bce5b-serving-cert\") pod \"console-operator-58897d9998-csgxx\" (UID: \"2fdbba2d-2b9f-47f3-a618-f1284f5bce5b\") " pod="openshift-console-operator/console-operator-58897d9998-csgxx"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.386626 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5b108e0-c0b3-442b-82c7-4ec003e3de22-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ck2fw\" (UID: \"a5b108e0-c0b3-442b-82c7-4ec003e3de22\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ck2fw"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.386641 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/77548ba9-d52a-4585-984e-e08c45a58aec-etcd-serving-ca\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.386658 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dff981f7-635e-4b45-bf64-fbb57407582b-console-serving-cert\") pod \"console-f9d7485db-nds8d\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " pod="openshift-console/console-f9d7485db-nds8d"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.386685 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lswx9\" (UniqueName: \"kubernetes.io/projected/f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c-kube-api-access-lswx9\") pod \"etcd-operator-b45778765-xjzz7\" (UID: \"f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xjzz7"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.386700 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a96ffd28-b774-40e1-ad52-e6fa63483f1d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hgrnc\" (UID: \"a96ffd28-b774-40e1-ad52-e6fa63483f1d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgrnc"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.386716 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/77548ba9-d52a-4585-984e-e08c45a58aec-audit\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.386731 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dff981f7-635e-4b45-bf64-fbb57407582b-oauth-serving-cert\") pod \"console-f9d7485db-nds8d\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " pod="openshift-console/console-f9d7485db-nds8d"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.386748 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfg79\" (UniqueName: \"kubernetes.io/projected/3017a611-cb0d-4f79-b6f8-2634dc026e2e-kube-api-access-tfg79\") pod \"cluster-samples-operator-665b6dd947-dbxkf\" (UID: \"3017a611-cb0d-4f79-b6f8-2634dc026e2e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dbxkf"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.386764 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c-serving-cert\") pod \"etcd-operator-b45778765-xjzz7\" (UID: \"f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xjzz7"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.386778 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/77548ba9-d52a-4585-984e-e08c45a58aec-audit-dir\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.386794 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx96w\" (UniqueName: \"kubernetes.io/projected/dff981f7-635e-4b45-bf64-fbb57407582b-kube-api-access-bx96w\") pod \"console-f9d7485db-nds8d\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " pod="openshift-console/console-f9d7485db-nds8d"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.386852 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9b8f6ae-aa50-4e7a-a59e-359b18fada73-serving-cert\") pod \"route-controller-manager-6576b87f9c-26ntm\" (UID: \"b9b8f6ae-aa50-4e7a-a59e-359b18fada73\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.386903 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.386951 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cb20182c-f315-41b3-94e2-256dac142821-auth-proxy-config\") pod \"machine-approver-56656f9798-6gnnm\" (UID: \"cb20182c-f315-41b3-94e2-256dac142821\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6gnnm"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387008 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c-etcd-service-ca\") pod \"etcd-operator-b45778765-xjzz7\" (UID: \"f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xjzz7"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387027 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba59400b-2ce1-489d-a70d-747f23b176c6-config\") pod \"machine-api-operator-5694c8668f-tg8wr\" (UID: \"ba59400b-2ce1-489d-a70d-747f23b176c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tg8wr"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387042 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbp5t\" (UniqueName: \"kubernetes.io/projected/a5b108e0-c0b3-442b-82c7-4ec003e3de22-kube-api-access-fbp5t\") pod \"openshift-controller-manager-operator-756b6f6bc6-ck2fw\" (UID: \"a5b108e0-c0b3-442b-82c7-4ec003e3de22\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ck2fw"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387058 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/77548ba9-d52a-4585-984e-e08c45a58aec-encryption-config\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387072 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dff981f7-635e-4b45-bf64-fbb57407582b-console-oauth-config\") pod \"console-f9d7485db-nds8d\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " pod="openshift-console/console-f9d7485db-nds8d"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387086 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fst86\" (UID: \"0d8e31b0-652c-44ff-99b0-04ae7d329f6f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fst86"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387100 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77548ba9-d52a-4585-984e-e08c45a58aec-config\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387115 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9b8f6ae-aa50-4e7a-a59e-359b18fada73-client-ca\") pod \"route-controller-manager-6576b87f9c-26ntm\" (UID: \"b9b8f6ae-aa50-4e7a-a59e-359b18fada73\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387157 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba59400b-2ce1-489d-a70d-747f23b176c6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tg8wr\" (UID: \"ba59400b-2ce1-489d-a70d-747f23b176c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tg8wr"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387174 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d712e0bd-952b-4cba-8d92-2c6e72f6b867-audit-policies\") pod \"apiserver-7bbb656c7d-6jzt8\" (UID: \"d712e0bd-952b-4cba-8d92-2c6e72f6b867\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387190 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsfp7\" (UniqueName: \"kubernetes.io/projected/77548ba9-d52a-4585-984e-e08c45a58aec-kube-api-access-xsfp7\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387205 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c2c64f34-b460-412c-b82e-2dbc6c93444e-audit-policies\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387225 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387253 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ba59400b-2ce1-489d-a70d-747f23b176c6-images\") pod \"machine-api-operator-5694c8668f-tg8wr\" (UID: \"ba59400b-2ce1-489d-a70d-747f23b176c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tg8wr"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387267 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d712e0bd-952b-4cba-8d92-2c6e72f6b867-serving-cert\") pod \"apiserver-7bbb656c7d-6jzt8\" (UID: \"d712e0bd-952b-4cba-8d92-2c6e72f6b867\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387308 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387357 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca03f8ab-38f2-4aea-9b61-54526e3c5015-service-ca-bundle\") pod \"authentication-operator-69f744f599-dcg8h\" (UID: \"ca03f8ab-38f2-4aea-9b61-54526e3c5015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcg8h"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387380 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387400 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4lvv\" (UniqueName: \"kubernetes.io/projected/cb20182c-f315-41b3-94e2-256dac142821-kube-api-access-k4lvv\") pod \"machine-approver-56656f9798-6gnnm\" (UID: \"cb20182c-f315-41b3-94e2-256dac142821\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6gnnm"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387415 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dff981f7-635e-4b45-bf64-fbb57407582b-service-ca\") pod \"console-f9d7485db-nds8d\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " pod="openshift-console/console-f9d7485db-nds8d"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387437 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca03f8ab-38f2-4aea-9b61-54526e3c5015-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dcg8h\" (UID: \"ca03f8ab-38f2-4aea-9b61-54526e3c5015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcg8h"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387454 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvhkj\" (UniqueName: \"kubernetes.io/projected/ca03f8ab-38f2-4aea-9b61-54526e3c5015-kube-api-access-qvhkj\") pod \"authentication-operator-69f744f599-dcg8h\" (UID: \"ca03f8ab-38f2-4aea-9b61-54526e3c5015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcg8h"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387470 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d712e0bd-952b-4cba-8d92-2c6e72f6b867-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6jzt8\" (UID: \"d712e0bd-952b-4cba-8d92-2c6e72f6b867\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387484 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-config\") pod \"controller-manager-879f6c89f-fst86\" (UID: \"0d8e31b0-652c-44ff-99b0-04ae7d329f6f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fst86"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387500 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pzks\" (UniqueName: \"kubernetes.io/projected/ba59400b-2ce1-489d-a70d-747f23b176c6-kube-api-access-4pzks\") pod \"machine-api-operator-5694c8668f-tg8wr\" (UID: \"ba59400b-2ce1-489d-a70d-747f23b176c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tg8wr"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387516 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqr6z\" (UniqueName: \"kubernetes.io/projected/d712e0bd-952b-4cba-8d92-2c6e72f6b867-kube-api-access-jqr6z\") pod \"apiserver-7bbb656c7d-6jzt8\" (UID: \"d712e0bd-952b-4cba-8d92-2c6e72f6b867\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387533 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387552 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/468bcce5-4e0a-45dd-aeda-fe1a8fd0fea2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6wcpl\" (UID: \"468bcce5-4e0a-45dd-aeda-fe1a8fd0fea2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6wcpl"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387571 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fdbba2d-2b9f-47f3-a618-f1284f5bce5b-config\") pod \"console-operator-58897d9998-csgxx\" (UID: \"2fdbba2d-2b9f-47f3-a618-f1284f5bce5b\") " pod="openshift-console-operator/console-operator-58897d9998-csgxx"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387593 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5b108e0-c0b3-442b-82c7-4ec003e3de22-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ck2fw\" (UID: \"a5b108e0-c0b3-442b-82c7-4ec003e3de22\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ck2fw"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387609 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77548ba9-d52a-4585-984e-e08c45a58aec-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387626 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/77548ba9-d52a-4585-984e-e08c45a58aec-etcd-client\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387723 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387762 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3017a611-cb0d-4f79-b6f8-2634dc026e2e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-dbxkf\" (UID: \"3017a611-cb0d-4f79-b6f8-2634dc026e2e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dbxkf"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387784 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca03f8ab-38f2-4aea-9b61-54526e3c5015-config\") pod \"authentication-operator-69f744f599-dcg8h\" (UID: \"ca03f8ab-38f2-4aea-9b61-54526e3c5015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcg8h"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387799 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/77548ba9-d52a-4585-984e-e08c45a58aec-image-import-ca\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387815 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387911 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/77548ba9-d52a-4585-984e-e08c45a58aec-node-pullsecrets\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.387956 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxdjv\" (UniqueName: \"kubernetes.io/projected/c2c64f34-b460-412c-b82e-2dbc6c93444e-kube-api-access-mxdjv\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.388016 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.388058 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/cb20182c-f315-41b3-94e2-256dac142821-machine-approver-tls\") pod \"machine-approver-56656f9798-6gnnm\" (UID: \"cb20182c-f315-41b3-94e2-256dac142821\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6gnnm"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.388084 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2fdbba2d-2b9f-47f3-a618-f1284f5bce5b-trusted-ca\") pod \"console-operator-58897d9998-csgxx\" (UID: \"2fdbba2d-2b9f-47f3-a618-f1284f5bce5b\") " pod="openshift-console-operator/console-operator-58897d9998-csgxx"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.388104 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.388121 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmbvx\" (UniqueName: \"kubernetes.io/projected/468bcce5-4e0a-45dd-aeda-fe1a8fd0fea2-kube-api-access-mmbvx\") pod \"openshift-apiserver-operator-796bbdcf4f-6wcpl\" (UID: \"468bcce5-4e0a-45dd-aeda-fe1a8fd0fea2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6wcpl"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.388141 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffpz2\" (UniqueName: \"kubernetes.io/projected/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-kube-api-access-ffpz2\") pod \"controller-manager-879f6c89f-fst86\" (UID: \"0d8e31b0-652c-44ff-99b0-04ae7d329f6f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fst86"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.388178 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcpx2\" (UniqueName: \"kubernetes.io/projected/c669d56a-7d2e-4161-ac70-29d72a747038-kube-api-access-fcpx2\") pod \"downloads-7954f5f757-j8hqs\" (UID: \"c669d56a-7d2e-4161-ac70-29d72a747038\") " pod="openshift-console/downloads-7954f5f757-j8hqs"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.388197 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dff981f7-635e-4b45-bf64-fbb57407582b-console-config\") pod \"console-f9d7485db-nds8d\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " pod="openshift-console/console-f9d7485db-nds8d"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.388246 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9b8f6ae-aa50-4e7a-a59e-359b18fada73-config\") pod \"route-controller-manager-6576b87f9c-26ntm\" (UID: \"b9b8f6ae-aa50-4e7a-a59e-359b18fada73\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.388270 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c-etcd-ca\") pod \"etcd-operator-b45778765-xjzz7\" (UID: \"f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xjzz7"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.388288 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d712e0bd-952b-4cba-8d92-2c6e72f6b867-audit-dir\") pod \"apiserver-7bbb656c7d-6jzt8\" (UID: \"d712e0bd-952b-4cba-8d92-2c6e72f6b867\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.388310 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77548ba9-d52a-4585-984e-e08c45a58aec-serving-cert\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.388327 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.388355 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.388388 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dff981f7-635e-4b45-bf64-fbb57407582b-trusted-ca-bundle\") pod \"console-f9d7485db-nds8d\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " pod="openshift-console/console-f9d7485db-nds8d"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.388416 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9sj6\" (UniqueName: \"kubernetes.io/projected/b9b8f6ae-aa50-4e7a-a59e-359b18fada73-kube-api-access-b9sj6\") pod \"route-controller-manager-6576b87f9c-26ntm\" (UID: \"b9b8f6ae-aa50-4e7a-a59e-359b18fada73\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.388436 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcqlw\" (UniqueName: \"kubernetes.io/projected/2fdbba2d-2b9f-47f3-a618-f1284f5bce5b-kube-api-access-dcqlw\") pod \"console-operator-58897d9998-csgxx\" (UID: \"2fdbba2d-2b9f-47f3-a618-f1284f5bce5b\") " pod="openshift-console-operator/console-operator-58897d9998-csgxx"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.388450 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a96ffd28-b774-40e1-ad52-e6fa63483f1d-serving-cert\") pod \"openshift-config-operator-7777fb866f-hgrnc\" (UID: \"a96ffd28-b774-40e1-ad52-e6fa63483f1d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgrnc"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.388466 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c2c64f34-b460-412c-b82e-2dbc6c93444e-audit-dir\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.388855 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9m58d"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.389308 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.389359 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9m58d"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.390646 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.390955 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.391050 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.391101 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.391232 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.391316 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.391390 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.391473 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.391548 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.391618 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gxzkf"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.392114 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gxzkf"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.392844 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.393267 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.393431 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.393518 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.393828 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rwphq"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.394412 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rwphq"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.395541 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-hcltk"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.395893 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-hcltk"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.399194 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-w6vtn"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.399438 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.399548 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.400396 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dcxsj"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.400765 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6wcpl"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.402254 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-w6vtn"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.402973 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcxsj"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.403799 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lq6nd"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.404554 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lq6nd"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.407308 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.407992 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.408371 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.412181 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.414101 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.414292 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bqww9"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.420243 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.421428 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.435661 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-q7njt"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.440481 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ns9f4"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.441636 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ns9f4"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.442085 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bqww9"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.442625 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-q7njt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.443862 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6zl9n"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.458791 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.460367 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.461127 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6zl9n"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.462684 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.462866 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-6s884"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.463853 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.464403 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6s884"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.466935 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mjns4"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.467495 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tg8wr"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.467574 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjns4"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.472986 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.473288 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.473813 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330940-w7fp5"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.475531 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-dvzgt"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.475708 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330940-w7fp5"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.476386 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-w529g"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.476596 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dvzgt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.476774 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-w529g"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.476819 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.484139 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-67xwd"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.485020 4681 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-67xwd" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.487327 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x8lr7"] Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.488600 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x8lr7" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.488992 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbbbg\" (UniqueName: \"kubernetes.io/projected/1ae45842-477f-4cc6-9ff7-6c38b866e8f9-kube-api-access-qbbbg\") pod \"router-default-5444994796-hcltk\" (UID: \"1ae45842-477f-4cc6-9ff7-6c38b866e8f9\") " pod="openshift-ingress/router-default-5444994796-hcltk" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489038 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d181d51e-2df2-4025-b7aa-282418e6c9da-auth-proxy-config\") pod \"machine-config-operator-74547568cd-6s884\" (UID: \"d181d51e-2df2-4025-b7aa-282418e6c9da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6s884" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489058 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26c685e0-8c5d-4fdc-a5f4-8b746d285813-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rwphq\" (UID: \"26c685e0-8c5d-4fdc-a5f4-8b746d285813\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rwphq" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489078 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c-config\") pod \"etcd-operator-b45778765-xjzz7\" (UID: \"f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xjzz7" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489093 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d712e0bd-952b-4cba-8d92-2c6e72f6b867-etcd-client\") pod \"apiserver-7bbb656c7d-6jzt8\" (UID: \"d712e0bd-952b-4cba-8d92-2c6e72f6b867\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489134 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d712e0bd-952b-4cba-8d92-2c6e72f6b867-encryption-config\") pod \"apiserver-7bbb656c7d-6jzt8\" (UID: \"d712e0bd-952b-4cba-8d92-2c6e72f6b867\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489147 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/77548ba9-d52a-4585-984e-e08c45a58aec-etcd-serving-ca\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489162 4681 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dff981f7-635e-4b45-bf64-fbb57407582b-console-serving-cert\") pod \"console-f9d7485db-nds8d\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " pod="openshift-console/console-f9d7485db-nds8d" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489191 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5b108e0-c0b3-442b-82c7-4ec003e3de22-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ck2fw\" (UID: \"a5b108e0-c0b3-442b-82c7-4ec003e3de22\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ck2fw" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489209 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a96ffd28-b774-40e1-ad52-e6fa63483f1d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hgrnc\" (UID: \"a96ffd28-b774-40e1-ad52-e6fa63483f1d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgrnc" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489225 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dff981f7-635e-4b45-bf64-fbb57407582b-oauth-serving-cert\") pod \"console-f9d7485db-nds8d\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " pod="openshift-console/console-f9d7485db-nds8d" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489241 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1ae45842-477f-4cc6-9ff7-6c38b866e8f9-default-certificate\") pod \"router-default-5444994796-hcltk\" (UID: \"1ae45842-477f-4cc6-9ff7-6c38b866e8f9\") " pod="openshift-ingress/router-default-5444994796-hcltk" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489271 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9fd63e72-276a-43fe-8927-da5aba5b7a98-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mjns4\" (UID: \"9fd63e72-276a-43fe-8927-da5aba5b7a98\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjns4" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489289 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c-serving-cert\") pod \"etcd-operator-b45778765-xjzz7\" (UID: \"f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xjzz7" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489302 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/77548ba9-d52a-4585-984e-e08c45a58aec-audit-dir\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489350 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c7ff5e6-7b88-45f4-ab1c-fb8b7a40dc62-config\") 
pod \"kube-storage-version-migrator-operator-b67b599dd-q7njt\" (UID: \"2c7ff5e6-7b88-45f4-ab1c-fb8b7a40dc62\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-q7njt" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489366 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfg79\" (UniqueName: \"kubernetes.io/projected/3017a611-cb0d-4f79-b6f8-2634dc026e2e-kube-api-access-tfg79\") pod \"cluster-samples-operator-665b6dd947-dbxkf\" (UID: \"3017a611-cb0d-4f79-b6f8-2634dc026e2e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dbxkf" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489383 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489398 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9b8f6ae-aa50-4e7a-a59e-359b18fada73-serving-cert\") pod \"route-controller-manager-6576b87f9c-26ntm\" (UID: \"b9b8f6ae-aa50-4e7a-a59e-359b18fada73\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489429 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/67274805-ff68-4381-b1b6-9a6fbf85aca5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gxzkf\" (UID: \"67274805-ff68-4381-b1b6-9a6fbf85aca5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gxzkf" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489443 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dff981f7-635e-4b45-bf64-fbb57407582b-console-oauth-config\") pod \"console-f9d7485db-nds8d\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " pod="openshift-console/console-f9d7485db-nds8d" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489457 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba59400b-2ce1-489d-a70d-747f23b176c6-config\") pod \"machine-api-operator-5694c8668f-tg8wr\" (UID: \"ba59400b-2ce1-489d-a70d-747f23b176c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tg8wr" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489472 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2xkd\" (UniqueName: \"kubernetes.io/projected/9fd63e72-276a-43fe-8927-da5aba5b7a98-kube-api-access-r2xkd\") pod \"machine-config-controller-84d6567774-mjns4\" (UID: \"9fd63e72-276a-43fe-8927-da5aba5b7a98\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjns4" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489509 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9b8f6ae-aa50-4e7a-a59e-359b18fada73-client-ca\") pod 
\"route-controller-manager-6576b87f9c-26ntm\" (UID: \"b9b8f6ae-aa50-4e7a-a59e-359b18fada73\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489527 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brpvt\" (UniqueName: \"kubernetes.io/projected/dd5794df-cde0-4881-921f-9ba7006d4281-kube-api-access-brpvt\") pod \"control-plane-machine-set-operator-78cbb6b69f-6zl9n\" (UID: \"dd5794df-cde0-4881-921f-9ba7006d4281\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6zl9n" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489546 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba59400b-2ce1-489d-a70d-747f23b176c6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tg8wr\" (UID: \"ba59400b-2ce1-489d-a70d-747f23b176c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tg8wr" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489561 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/005cd89d-7e4e-4bed-aa11-e4d6f871710b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dcxsj\" (UID: \"005cd89d-7e4e-4bed-aa11-e4d6f871710b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcxsj" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489591 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9fd63e72-276a-43fe-8927-da5aba5b7a98-proxy-tls\") pod \"machine-config-controller-84d6567774-mjns4\" (UID: \"9fd63e72-276a-43fe-8927-da5aba5b7a98\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjns4" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489609 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489623 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ba59400b-2ce1-489d-a70d-747f23b176c6-images\") pod \"machine-api-operator-5694c8668f-tg8wr\" (UID: \"ba59400b-2ce1-489d-a70d-747f23b176c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tg8wr" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489637 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d712e0bd-952b-4cba-8d92-2c6e72f6b867-serving-cert\") pod \"apiserver-7bbb656c7d-6jzt8\" (UID: \"d712e0bd-952b-4cba-8d92-2c6e72f6b867\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489668 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489685 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/abc28b46-2a9f-4141-8e65-a9c956e0f261-config-volume\") pod \"collect-profiles-29330940-w7fp5\" (UID: \"abc28b46-2a9f-4141-8e65-a9c956e0f261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330940-w7fp5" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489708 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489738 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dff981f7-635e-4b45-bf64-fbb57407582b-service-ca\") pod \"console-f9d7485db-nds8d\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " pod="openshift-console/console-f9d7485db-nds8d" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489754 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26c685e0-8c5d-4fdc-a5f4-8b746d285813-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rwphq\" (UID: \"26c685e0-8c5d-4fdc-a5f4-8b746d285813\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rwphq" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489770 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvhkj\" (UniqueName: \"kubernetes.io/projected/ca03f8ab-38f2-4aea-9b61-54526e3c5015-kube-api-access-qvhkj\") pod \"authentication-operator-69f744f599-dcg8h\" (UID: \"ca03f8ab-38f2-4aea-9b61-54526e3c5015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcg8h" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489784 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d712e0bd-952b-4cba-8d92-2c6e72f6b867-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6jzt8\" (UID: \"d712e0bd-952b-4cba-8d92-2c6e72f6b867\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489798 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ae45842-477f-4cc6-9ff7-6c38b866e8f9-metrics-certs\") pod \"router-default-5444994796-hcltk\" (UID: \"1ae45842-477f-4cc6-9ff7-6c38b866e8f9\") " pod="openshift-ingress/router-default-5444994796-hcltk" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489828 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489844 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/67274805-ff68-4381-b1b6-9a6fbf85aca5-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gxzkf\" (UID: \"67274805-ff68-4381-b1b6-9a6fbf85aca5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gxzkf" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489864 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/77548ba9-d52a-4585-984e-e08c45a58aec-etcd-client\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489899 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/837aa149-aced-4911-bdf3-c25e502dc542-serving-cert\") pod \"service-ca-operator-777779d784-67xwd\" (UID: \"837aa149-aced-4911-bdf3-c25e502dc542\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-67xwd" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489922 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3017a611-cb0d-4f79-b6f8-2634dc026e2e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-dbxkf\" (UID: \"3017a611-cb0d-4f79-b6f8-2634dc026e2e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dbxkf" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489937 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxdjv\" (UniqueName: \"kubernetes.io/projected/c2c64f34-b460-412c-b82e-2dbc6c93444e-kube-api-access-mxdjv\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489952 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdh46\" (UniqueName: \"kubernetes.io/projected/2c7ff5e6-7b88-45f4-ab1c-fb8b7a40dc62-kube-api-access-tdh46\") pod \"kube-storage-version-migrator-operator-b67b599dd-q7njt\" (UID: \"2c7ff5e6-7b88-45f4-ab1c-fb8b7a40dc62\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-q7njt" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489967 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/77548ba9-d52a-4585-984e-e08c45a58aec-node-pullsecrets\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489981 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/26c685e0-8c5d-4fdc-a5f4-8b746d285813-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rwphq\" (UID: \"26c685e0-8c5d-4fdc-a5f4-8b746d285813\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rwphq" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.489996 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ae45842-477f-4cc6-9ff7-6c38b866e8f9-service-ca-bundle\") pod \"router-default-5444994796-hcltk\" (UID: \"1ae45842-477f-4cc6-9ff7-6c38b866e8f9\") " pod="openshift-ingress/router-default-5444994796-hcltk" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.490010 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d181d51e-2df2-4025-b7aa-282418e6c9da-proxy-tls\") pod \"machine-config-operator-74547568cd-6s884\" (UID: \"d181d51e-2df2-4025-b7aa-282418e6c9da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6s884" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.490040 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2fdbba2d-2b9f-47f3-a618-f1284f5bce5b-trusted-ca\") pod \"console-operator-58897d9998-csgxx\" (UID: \"2fdbba2d-2b9f-47f3-a618-f1284f5bce5b\") " pod="openshift-console-operator/console-operator-58897d9998-csgxx" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.490056 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.490073 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmbvx\" (UniqueName: \"kubernetes.io/projected/468bcce5-4e0a-45dd-aeda-fe1a8fd0fea2-kube-api-access-mmbvx\") pod \"openshift-apiserver-operator-796bbdcf4f-6wcpl\" (UID: \"468bcce5-4e0a-45dd-aeda-fe1a8fd0fea2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6wcpl" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.490089 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcpx2\" (UniqueName: \"kubernetes.io/projected/c669d56a-7d2e-4161-ac70-29d72a747038-kube-api-access-fcpx2\") pod \"downloads-7954f5f757-j8hqs\" (UID: \"c669d56a-7d2e-4161-ac70-29d72a747038\") " pod="openshift-console/downloads-7954f5f757-j8hqs" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.490104 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dff981f7-635e-4b45-bf64-fbb57407582b-console-config\") pod \"console-f9d7485db-nds8d\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " pod="openshift-console/console-f9d7485db-nds8d" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.490119 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77548ba9-d52a-4585-984e-e08c45a58aec-serving-cert\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.490135 4681 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1746a4d0-d93b-40e4-bd79-a14425d1a9cc-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4g7gf\" (UID: \"1746a4d0-d93b-40e4-bd79-a14425d1a9cc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4g7gf" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.490156 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dff981f7-635e-4b45-bf64-fbb57407582b-trusted-ca-bundle\") pod \"console-f9d7485db-nds8d\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " pod="openshift-console/console-f9d7485db-nds8d" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.490171 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcqlw\" (UniqueName: \"kubernetes.io/projected/2fdbba2d-2b9f-47f3-a618-f1284f5bce5b-kube-api-access-dcqlw\") pod \"console-operator-58897d9998-csgxx\" (UID: \"2fdbba2d-2b9f-47f3-a618-f1284f5bce5b\") " pod="openshift-console-operator/console-operator-58897d9998-csgxx" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.490186 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a96ffd28-b774-40e1-ad52-e6fa63483f1d-serving-cert\") pod \"openshift-config-operator-7777fb866f-hgrnc\" (UID: \"a96ffd28-b774-40e1-ad52-e6fa63483f1d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgrnc" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.490203 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9sj6\" (UniqueName: \"kubernetes.io/projected/b9b8f6ae-aa50-4e7a-a59e-359b18fada73-kube-api-access-b9sj6\") pod \"route-controller-manager-6576b87f9c-26ntm\" (UID: \"b9b8f6ae-aa50-4e7a-a59e-359b18fada73\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.490218 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/468bcce5-4e0a-45dd-aeda-fe1a8fd0fea2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6wcpl\" (UID: \"468bcce5-4e0a-45dd-aeda-fe1a8fd0fea2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6wcpl" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.490232 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d712e0bd-952b-4cba-8d92-2c6e72f6b867-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6jzt8\" (UID: \"d712e0bd-952b-4cba-8d92-2c6e72f6b867\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.490261 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-serving-cert\") pod \"controller-manager-879f6c89f-fst86\" (UID: \"0d8e31b0-652c-44ff-99b0-04ae7d329f6f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fst86" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.490276 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxhq9\" (UniqueName: 
\"kubernetes.io/projected/005cd89d-7e4e-4bed-aa11-e4d6f871710b-kube-api-access-nxhq9\") pod \"ingress-operator-5b745b69d9-dcxsj\" (UID: \"005cd89d-7e4e-4bed-aa11-e4d6f871710b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcxsj" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.490293 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60fcc2ef-a564-4e1e-947d-1dd672c4ced7-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-ns9f4\" (UID: \"60fcc2ef-a564-4e1e-947d-1dd672c4ced7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ns9f4" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.490310 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c-etcd-client\") pod \"etcd-operator-b45778765-xjzz7\" (UID: \"f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xjzz7" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.490327 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb20182c-f315-41b3-94e2-256dac142821-config\") pod \"machine-approver-56656f9798-6gnnm\" (UID: \"cb20182c-f315-41b3-94e2-256dac142821\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6gnnm" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.490340 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-client-ca\") pod \"controller-manager-879f6c89f-fst86\" (UID: \"0d8e31b0-652c-44ff-99b0-04ae7d329f6f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fst86" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.490356 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvklw\" (UniqueName: \"kubernetes.io/projected/a96ffd28-b774-40e1-ad52-e6fa63483f1d-kube-api-access-kvklw\") pod \"openshift-config-operator-7777fb866f-hgrnc\" (UID: \"a96ffd28-b774-40e1-ad52-e6fa63483f1d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgrnc" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.490369 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dff981f7-635e-4b45-bf64-fbb57407582b-oauth-serving-cert\") pod \"console-f9d7485db-nds8d\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " pod="openshift-console/console-f9d7485db-nds8d" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.490386 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca03f8ab-38f2-4aea-9b61-54526e3c5015-serving-cert\") pod \"authentication-operator-69f744f599-dcg8h\" (UID: \"ca03f8ab-38f2-4aea-9b61-54526e3c5015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcg8h" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.491213 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fdbba2d-2b9f-47f3-a618-f1284f5bce5b-serving-cert\") pod \"console-operator-58897d9998-csgxx\" (UID: \"2fdbba2d-2b9f-47f3-a618-f1284f5bce5b\") " 
pod="openshift-console-operator/console-operator-58897d9998-csgxx" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.492575 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lswx9\" (UniqueName: \"kubernetes.io/projected/f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c-kube-api-access-lswx9\") pod \"etcd-operator-b45778765-xjzz7\" (UID: \"f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xjzz7" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.492595 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/77548ba9-d52a-4585-984e-e08c45a58aec-audit\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.492616 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ee9a95e-d102-45ea-a77b-711de5bd03a9-config-volume\") pod \"dns-default-dvzgt\" (UID: \"4ee9a95e-d102-45ea-a77b-711de5bd03a9\") " pod="openshift-dns/dns-default-dvzgt" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.492640 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx96w\" (UniqueName: \"kubernetes.io/projected/dff981f7-635e-4b45-bf64-fbb57407582b-kube-api-access-bx96w\") pod \"console-f9d7485db-nds8d\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " pod="openshift-console/console-f9d7485db-nds8d" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.492659 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d181d51e-2df2-4025-b7aa-282418e6c9da-images\") pod \"machine-config-operator-74547568cd-6s884\" (UID: \"d181d51e-2df2-4025-b7aa-282418e6c9da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6s884" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.492677 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g55m\" (UniqueName: \"kubernetes.io/projected/4ee9a95e-d102-45ea-a77b-711de5bd03a9-kube-api-access-4g55m\") pod \"dns-default-dvzgt\" (UID: \"4ee9a95e-d102-45ea-a77b-711de5bd03a9\") " pod="openshift-dns/dns-default-dvzgt" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.492701 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/dd5794df-cde0-4881-921f-9ba7006d4281-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-6zl9n\" (UID: \"dd5794df-cde0-4881-921f-9ba7006d4281\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6zl9n" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.492720 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cb20182c-f315-41b3-94e2-256dac142821-auth-proxy-config\") pod \"machine-approver-56656f9798-6gnnm\" (UID: \"cb20182c-f315-41b3-94e2-256dac142821\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6gnnm" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.492737 4681 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ee9a95e-d102-45ea-a77b-711de5bd03a9-metrics-tls\") pod \"dns-default-dvzgt\" (UID: \"4ee9a95e-d102-45ea-a77b-711de5bd03a9\") " pod="openshift-dns/dns-default-dvzgt" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.492758 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbp5t\" (UniqueName: \"kubernetes.io/projected/a5b108e0-c0b3-442b-82c7-4ec003e3de22-kube-api-access-fbp5t\") pod \"openshift-controller-manager-operator-756b6f6bc6-ck2fw\" (UID: \"a5b108e0-c0b3-442b-82c7-4ec003e3de22\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ck2fw" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.492776 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/77548ba9-d52a-4585-984e-e08c45a58aec-encryption-config\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.492797 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fst86\" (UID: \"0d8e31b0-652c-44ff-99b0-04ae7d329f6f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fst86" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.492814 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c-etcd-service-ca\") pod \"etcd-operator-b45778765-xjzz7\" (UID: \"f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xjzz7" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.492831 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77548ba9-d52a-4585-984e-e08c45a58aec-config\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.492847 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1ae45842-477f-4cc6-9ff7-6c38b866e8f9-stats-auth\") pod \"router-default-5444994796-hcltk\" (UID: \"1ae45842-477f-4cc6-9ff7-6c38b866e8f9\") " pod="openshift-ingress/router-default-5444994796-hcltk" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.492866 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/005cd89d-7e4e-4bed-aa11-e4d6f871710b-trusted-ca\") pod \"ingress-operator-5b745b69d9-dcxsj\" (UID: \"005cd89d-7e4e-4bed-aa11-e4d6f871710b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcxsj" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.492923 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d712e0bd-952b-4cba-8d92-2c6e72f6b867-audit-policies\") pod \"apiserver-7bbb656c7d-6jzt8\" (UID: 
\"d712e0bd-952b-4cba-8d92-2c6e72f6b867\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.492971 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsfp7\" (UniqueName: \"kubernetes.io/projected/77548ba9-d52a-4585-984e-e08c45a58aec-kube-api-access-xsfp7\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493014 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c2c64f34-b460-412c-b82e-2dbc6c93444e-audit-policies\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493046 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsljs\" (UniqueName: \"kubernetes.io/projected/abc28b46-2a9f-4141-8e65-a9c956e0f261-kube-api-access-lsljs\") pod \"collect-profiles-29330940-w7fp5\" (UID: \"abc28b46-2a9f-4141-8e65-a9c956e0f261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330940-w7fp5" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493081 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca03f8ab-38f2-4aea-9b61-54526e3c5015-service-ca-bundle\") pod \"authentication-operator-69f744f599-dcg8h\" (UID: \"ca03f8ab-38f2-4aea-9b61-54526e3c5015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcg8h" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493119 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4lvv\" (UniqueName: \"kubernetes.io/projected/cb20182c-f315-41b3-94e2-256dac142821-kube-api-access-k4lvv\") pod \"machine-approver-56656f9798-6gnnm\" (UID: \"cb20182c-f315-41b3-94e2-256dac142821\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6gnnm" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493136 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/abc28b46-2a9f-4141-8e65-a9c956e0f261-secret-volume\") pod \"collect-profiles-29330940-w7fp5\" (UID: \"abc28b46-2a9f-4141-8e65-a9c956e0f261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330940-w7fp5" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493161 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-config\") pod \"controller-manager-879f6c89f-fst86\" (UID: \"0d8e31b0-652c-44ff-99b0-04ae7d329f6f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fst86" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493178 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1746a4d0-d93b-40e4-bd79-a14425d1a9cc-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4g7gf\" (UID: \"1746a4d0-d93b-40e4-bd79-a14425d1a9cc\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4g7gf" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493208 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca03f8ab-38f2-4aea-9b61-54526e3c5015-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dcg8h\" (UID: \"ca03f8ab-38f2-4aea-9b61-54526e3c5015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcg8h" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493225 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqr6z\" (UniqueName: \"kubernetes.io/projected/d712e0bd-952b-4cba-8d92-2c6e72f6b867-kube-api-access-jqr6z\") pod \"apiserver-7bbb656c7d-6jzt8\" (UID: \"d712e0bd-952b-4cba-8d92-2c6e72f6b867\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493245 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1746a4d0-d93b-40e4-bd79-a14425d1a9cc-config\") pod \"kube-controller-manager-operator-78b949d7b-4g7gf\" (UID: \"1746a4d0-d93b-40e4-bd79-a14425d1a9cc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4g7gf" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493267 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pzks\" (UniqueName: \"kubernetes.io/projected/ba59400b-2ce1-489d-a70d-747f23b176c6-kube-api-access-4pzks\") pod \"machine-api-operator-5694c8668f-tg8wr\" (UID: \"ba59400b-2ce1-489d-a70d-747f23b176c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tg8wr" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493297 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/468bcce5-4e0a-45dd-aeda-fe1a8fd0fea2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6wcpl\" (UID: \"468bcce5-4e0a-45dd-aeda-fe1a8fd0fea2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6wcpl" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493320 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fdbba2d-2b9f-47f3-a618-f1284f5bce5b-config\") pod \"console-operator-58897d9998-csgxx\" (UID: \"2fdbba2d-2b9f-47f3-a618-f1284f5bce5b\") " pod="openshift-console-operator/console-operator-58897d9998-csgxx" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493352 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5b108e0-c0b3-442b-82c7-4ec003e3de22-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ck2fw\" (UID: \"a5b108e0-c0b3-442b-82c7-4ec003e3de22\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ck2fw" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493369 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77548ba9-d52a-4585-984e-e08c45a58aec-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" Oct 07 
17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493388 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2fb24a65-5ed4-42a8-9e29-c8bcd1e0de14-metrics-tls\") pod \"dns-operator-744455d44c-9m58d\" (UID: \"2fb24a65-5ed4-42a8-9e29-c8bcd1e0de14\") " pod="openshift-dns-operator/dns-operator-744455d44c-9m58d" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493410 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493428 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glwm2\" (UniqueName: \"kubernetes.io/projected/837aa149-aced-4911-bdf3-c25e502dc542-kube-api-access-glwm2\") pod \"service-ca-operator-777779d784-67xwd\" (UID: \"837aa149-aced-4911-bdf3-c25e502dc542\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-67xwd" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493446 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chrn7\" (UniqueName: \"kubernetes.io/projected/d181d51e-2df2-4025-b7aa-282418e6c9da-kube-api-access-chrn7\") pod \"machine-config-operator-74547568cd-6s884\" (UID: \"d181d51e-2df2-4025-b7aa-282418e6c9da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6s884" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493467 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca03f8ab-38f2-4aea-9b61-54526e3c5015-config\") pod \"authentication-operator-69f744f599-dcg8h\" (UID: \"ca03f8ab-38f2-4aea-9b61-54526e3c5015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcg8h" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493483 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/77548ba9-d52a-4585-984e-e08c45a58aec-image-import-ca\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493500 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493518 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szlk7\" (UniqueName: \"kubernetes.io/projected/2fb24a65-5ed4-42a8-9e29-c8bcd1e0de14-kube-api-access-szlk7\") pod \"dns-operator-744455d44c-9m58d\" (UID: \"2fb24a65-5ed4-42a8-9e29-c8bcd1e0de14\") " pod="openshift-dns-operator/dns-operator-744455d44c-9m58d" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493533 4681 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tbbf\" (UniqueName: \"kubernetes.io/projected/1af973a2-4ae5-4a2e-9d36-941f8689054b-kube-api-access-8tbbf\") pod \"migrator-59844c95c7-lq6nd\" (UID: \"1af973a2-4ae5-4a2e-9d36-941f8689054b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lq6nd" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493552 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c7ff5e6-7b88-45f4-ab1c-fb8b7a40dc62-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-q7njt\" (UID: \"2c7ff5e6-7b88-45f4-ab1c-fb8b7a40dc62\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-q7njt" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493573 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5g6r\" (UniqueName: \"kubernetes.io/projected/67274805-ff68-4381-b1b6-9a6fbf85aca5-kube-api-access-x5g6r\") pod \"cluster-image-registry-operator-dc59b4c8b-gxzkf\" (UID: \"67274805-ff68-4381-b1b6-9a6fbf85aca5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gxzkf" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493589 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/837aa149-aced-4911-bdf3-c25e502dc542-config\") pod \"service-ca-operator-777779d784-67xwd\" (UID: \"837aa149-aced-4911-bdf3-c25e502dc542\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-67xwd" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493608 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60fcc2ef-a564-4e1e-947d-1dd672c4ced7-config\") pod \"kube-apiserver-operator-766d6c64bb-ns9f4\" (UID: \"60fcc2ef-a564-4e1e-947d-1dd672c4ced7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ns9f4" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493621 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cb20182c-f315-41b3-94e2-256dac142821-auth-proxy-config\") pod \"machine-approver-56656f9798-6gnnm\" (UID: \"cb20182c-f315-41b3-94e2-256dac142821\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6gnnm" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493626 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/cb20182c-f315-41b3-94e2-256dac142821-machine-approver-tls\") pod \"machine-approver-56656f9798-6gnnm\" (UID: \"cb20182c-f315-41b3-94e2-256dac142821\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6gnnm" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493649 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:05:36 crc kubenswrapper[4681]: 
I1007 17:05:36.493668 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/67274805-ff68-4381-b1b6-9a6fbf85aca5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gxzkf\" (UID: \"67274805-ff68-4381-b1b6-9a6fbf85aca5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gxzkf" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493686 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/005cd89d-7e4e-4bed-aa11-e4d6f871710b-metrics-tls\") pod \"ingress-operator-5b745b69d9-dcxsj\" (UID: \"005cd89d-7e4e-4bed-aa11-e4d6f871710b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcxsj" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493713 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffpz2\" (UniqueName: \"kubernetes.io/projected/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-kube-api-access-ffpz2\") pod \"controller-manager-879f6c89f-fst86\" (UID: \"0d8e31b0-652c-44ff-99b0-04ae7d329f6f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fst86" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493731 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493747 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9b8f6ae-aa50-4e7a-a59e-359b18fada73-config\") pod \"route-controller-manager-6576b87f9c-26ntm\" (UID: \"b9b8f6ae-aa50-4e7a-a59e-359b18fada73\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493766 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c-etcd-ca\") pod \"etcd-operator-b45778765-xjzz7\" (UID: \"f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xjzz7" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493782 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d712e0bd-952b-4cba-8d92-2c6e72f6b867-audit-dir\") pod \"apiserver-7bbb656c7d-6jzt8\" (UID: \"d712e0bd-952b-4cba-8d92-2c6e72f6b867\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493801 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493820 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/60fcc2ef-a564-4e1e-947d-1dd672c4ced7-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-ns9f4\" (UID: \"60fcc2ef-a564-4e1e-947d-1dd672c4ced7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ns9f4" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493839 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c2c64f34-b460-412c-b82e-2dbc6c93444e-audit-dir\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.493963 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c2c64f34-b460-412c-b82e-2dbc6c93444e-audit-dir\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.494513 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/77548ba9-d52a-4585-984e-e08c45a58aec-audit\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.494761 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lwhwn"] Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.495572 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lwhwn" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.496500 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77548ba9-d52a-4585-984e-e08c45a58aec-config\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.497605 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fst86\" (UID: \"0d8e31b0-652c-44ff-99b0-04ae7d329f6f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fst86" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.498353 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c-etcd-service-ca\") pod \"etcd-operator-b45778765-xjzz7\" (UID: \"f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xjzz7" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.498767 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.501227 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/77548ba9-d52a-4585-984e-e08c45a58aec-image-import-ca\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " 
pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.501792 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca03f8ab-38f2-4aea-9b61-54526e3c5015-service-ca-bundle\") pod \"authentication-operator-69f744f599-dcg8h\" (UID: \"ca03f8ab-38f2-4aea-9b61-54526e3c5015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcg8h" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.502365 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hq69t"] Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.503735 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-config\") pod \"controller-manager-879f6c89f-fst86\" (UID: \"0d8e31b0-652c-44ff-99b0-04ae7d329f6f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fst86" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.504590 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca03f8ab-38f2-4aea-9b61-54526e3c5015-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dcg8h\" (UID: \"ca03f8ab-38f2-4aea-9b61-54526e3c5015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcg8h" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.505852 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hq69t" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.507922 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7vd82"] Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.508666 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7vd82" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.511450 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/468bcce5-4e0a-45dd-aeda-fe1a8fd0fea2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6wcpl\" (UID: \"468bcce5-4e0a-45dd-aeda-fe1a8fd0fea2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6wcpl" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.512320 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fdbba2d-2b9f-47f3-a618-f1284f5bce5b-config\") pod \"console-operator-58897d9998-csgxx\" (UID: \"2fdbba2d-2b9f-47f3-a618-f1284f5bce5b\") " pod="openshift-console-operator/console-operator-58897d9998-csgxx" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.513472 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5b108e0-c0b3-442b-82c7-4ec003e3de22-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ck2fw\" (UID: \"a5b108e0-c0b3-442b-82c7-4ec003e3de22\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ck2fw" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.514178 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77548ba9-d52a-4585-984e-e08c45a58aec-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.515416 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c-config\") pod \"etcd-operator-b45778765-xjzz7\" (UID: \"f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xjzz7" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.519778 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c2c64f34-b460-412c-b82e-2dbc6c93444e-audit-policies\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.520712 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca03f8ab-38f2-4aea-9b61-54526e3c5015-config\") pod \"authentication-operator-69f744f599-dcg8h\" (UID: \"ca03f8ab-38f2-4aea-9b61-54526e3c5015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcg8h" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.521291 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.521773 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a96ffd28-b774-40e1-ad52-e6fa63483f1d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hgrnc\" (UID: \"a96ffd28-b774-40e1-ad52-e6fa63483f1d\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgrnc" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.522982 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/77548ba9-d52a-4585-984e-e08c45a58aec-audit-dir\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.523034 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm"] Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.523991 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.524764 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/77548ba9-d52a-4585-984e-e08c45a58aec-etcd-serving-ca\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.525237 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.526120 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca03f8ab-38f2-4aea-9b61-54526e3c5015-serving-cert\") pod \"authentication-operator-69f744f599-dcg8h\" (UID: \"ca03f8ab-38f2-4aea-9b61-54526e3c5015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcg8h" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.536126 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dcg8h"] Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.536173 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8"] Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.536184 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hgrnc"] Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.547216 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3017a611-cb0d-4f79-b6f8-2634dc026e2e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-dbxkf\" (UID: \"3017a611-cb0d-4f79-b6f8-2634dc026e2e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dbxkf" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.547415 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/77548ba9-d52a-4585-984e-e08c45a58aec-node-pullsecrets\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.547720 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c-serving-cert\") pod \"etcd-operator-b45778765-xjzz7\" (UID: \"f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xjzz7" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.548147 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.548442 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.548511 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2fdbba2d-2b9f-47f3-a618-f1284f5bce5b-trusted-ca\") pod \"console-operator-58897d9998-csgxx\" (UID: \"2fdbba2d-2b9f-47f3-a618-f1284f5bce5b\") " pod="openshift-console-operator/console-operator-58897d9998-csgxx" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.549022 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9b8f6ae-aa50-4e7a-a59e-359b18fada73-config\") pod \"route-controller-manager-6576b87f9c-26ntm\" (UID: \"b9b8f6ae-aa50-4e7a-a59e-359b18fada73\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.549051 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9b8f6ae-aa50-4e7a-a59e-359b18fada73-serving-cert\") pod \"route-controller-manager-6576b87f9c-26ntm\" (UID: \"b9b8f6ae-aa50-4e7a-a59e-359b18fada73\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.549410 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5b108e0-c0b3-442b-82c7-4ec003e3de22-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ck2fw\" (UID: \"a5b108e0-c0b3-442b-82c7-4ec003e3de22\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ck2fw" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.549426 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c-etcd-ca\") pod \"etcd-operator-b45778765-xjzz7\" (UID: \"f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-xjzz7" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.549464 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d712e0bd-952b-4cba-8d92-2c6e72f6b867-audit-dir\") pod \"apiserver-7bbb656c7d-6jzt8\" (UID: \"d712e0bd-952b-4cba-8d92-2c6e72f6b867\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.549825 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dff981f7-635e-4b45-bf64-fbb57407582b-console-config\") pod \"console-f9d7485db-nds8d\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " pod="openshift-console/console-f9d7485db-nds8d" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.549971 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dff981f7-635e-4b45-bf64-fbb57407582b-console-serving-cert\") pod \"console-f9d7485db-nds8d\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " pod="openshift-console/console-f9d7485db-nds8d" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.549999 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d712e0bd-952b-4cba-8d92-2c6e72f6b867-encryption-config\") pod \"apiserver-7bbb656c7d-6jzt8\" (UID: \"d712e0bd-952b-4cba-8d92-2c6e72f6b867\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.550682 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d712e0bd-952b-4cba-8d92-2c6e72f6b867-audit-policies\") pod \"apiserver-7bbb656c7d-6jzt8\" (UID: \"d712e0bd-952b-4cba-8d92-2c6e72f6b867\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.551766 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d712e0bd-952b-4cba-8d92-2c6e72f6b867-etcd-client\") pod \"apiserver-7bbb656c7d-6jzt8\" (UID: \"d712e0bd-952b-4cba-8d92-2c6e72f6b867\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.554975 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d712e0bd-952b-4cba-8d92-2c6e72f6b867-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6jzt8\" (UID: \"d712e0bd-952b-4cba-8d92-2c6e72f6b867\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.555407 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.593133 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: 
\"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.593463 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/cb20182c-f315-41b3-94e2-256dac142821-machine-approver-tls\") pod \"machine-approver-56656f9798-6gnnm\" (UID: \"cb20182c-f315-41b3-94e2-256dac142821\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6gnnm" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.594033 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dff981f7-635e-4b45-bf64-fbb57407582b-trusted-ca-bundle\") pod \"console-f9d7485db-nds8d\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " pod="openshift-console/console-f9d7485db-nds8d" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.594275 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fdbba2d-2b9f-47f3-a618-f1284f5bce5b-serving-cert\") pod \"console-operator-58897d9998-csgxx\" (UID: \"2fdbba2d-2b9f-47f3-a618-f1284f5bce5b\") " pod="openshift-console-operator/console-operator-58897d9998-csgxx" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.594997 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2fb24a65-5ed4-42a8-9e29-c8bcd1e0de14-metrics-tls\") pod \"dns-operator-744455d44c-9m58d\" (UID: \"2fb24a65-5ed4-42a8-9e29-c8bcd1e0de14\") " pod="openshift-dns-operator/dns-operator-744455d44c-9m58d" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595036 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chrn7\" (UniqueName: \"kubernetes.io/projected/d181d51e-2df2-4025-b7aa-282418e6c9da-kube-api-access-chrn7\") pod \"machine-config-operator-74547568cd-6s884\" (UID: \"d181d51e-2df2-4025-b7aa-282418e6c9da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6s884" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595061 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glwm2\" (UniqueName: \"kubernetes.io/projected/837aa149-aced-4911-bdf3-c25e502dc542-kube-api-access-glwm2\") pod \"service-ca-operator-777779d784-67xwd\" (UID: \"837aa149-aced-4911-bdf3-c25e502dc542\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-67xwd" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595079 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szlk7\" (UniqueName: \"kubernetes.io/projected/2fb24a65-5ed4-42a8-9e29-c8bcd1e0de14-kube-api-access-szlk7\") pod \"dns-operator-744455d44c-9m58d\" (UID: \"2fb24a65-5ed4-42a8-9e29-c8bcd1e0de14\") " pod="openshift-dns-operator/dns-operator-744455d44c-9m58d" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595099 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tbbf\" (UniqueName: \"kubernetes.io/projected/1af973a2-4ae5-4a2e-9d36-941f8689054b-kube-api-access-8tbbf\") pod \"migrator-59844c95c7-lq6nd\" (UID: \"1af973a2-4ae5-4a2e-9d36-941f8689054b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lq6nd" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595122 4681 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-x5g6r\" (UniqueName: \"kubernetes.io/projected/67274805-ff68-4381-b1b6-9a6fbf85aca5-kube-api-access-x5g6r\") pod \"cluster-image-registry-operator-dc59b4c8b-gxzkf\" (UID: \"67274805-ff68-4381-b1b6-9a6fbf85aca5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gxzkf" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595143 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c7ff5e6-7b88-45f4-ab1c-fb8b7a40dc62-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-q7njt\" (UID: \"2c7ff5e6-7b88-45f4-ab1c-fb8b7a40dc62\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-q7njt" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595169 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/67274805-ff68-4381-b1b6-9a6fbf85aca5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gxzkf\" (UID: \"67274805-ff68-4381-b1b6-9a6fbf85aca5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gxzkf" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595190 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/837aa149-aced-4911-bdf3-c25e502dc542-config\") pod \"service-ca-operator-777779d784-67xwd\" (UID: \"837aa149-aced-4911-bdf3-c25e502dc542\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-67xwd" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595207 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60fcc2ef-a564-4e1e-947d-1dd672c4ced7-config\") pod \"kube-apiserver-operator-766d6c64bb-ns9f4\" (UID: \"60fcc2ef-a564-4e1e-947d-1dd672c4ced7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ns9f4" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595249 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/005cd89d-7e4e-4bed-aa11-e4d6f871710b-metrics-tls\") pod \"ingress-operator-5b745b69d9-dcxsj\" (UID: \"005cd89d-7e4e-4bed-aa11-e4d6f871710b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcxsj" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595276 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60fcc2ef-a564-4e1e-947d-1dd672c4ced7-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-ns9f4\" (UID: \"60fcc2ef-a564-4e1e-947d-1dd672c4ced7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ns9f4" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595297 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26c685e0-8c5d-4fdc-a5f4-8b746d285813-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rwphq\" (UID: \"26c685e0-8c5d-4fdc-a5f4-8b746d285813\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rwphq" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595327 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-qbbbg\" (UniqueName: \"kubernetes.io/projected/1ae45842-477f-4cc6-9ff7-6c38b866e8f9-kube-api-access-qbbbg\") pod \"router-default-5444994796-hcltk\" (UID: \"1ae45842-477f-4cc6-9ff7-6c38b866e8f9\") " pod="openshift-ingress/router-default-5444994796-hcltk" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595350 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d181d51e-2df2-4025-b7aa-282418e6c9da-auth-proxy-config\") pod \"machine-config-operator-74547568cd-6s884\" (UID: \"d181d51e-2df2-4025-b7aa-282418e6c9da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6s884" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595388 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1ae45842-477f-4cc6-9ff7-6c38b866e8f9-default-certificate\") pod \"router-default-5444994796-hcltk\" (UID: \"1ae45842-477f-4cc6-9ff7-6c38b866e8f9\") " pod="openshift-ingress/router-default-5444994796-hcltk" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595409 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9fd63e72-276a-43fe-8927-da5aba5b7a98-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mjns4\" (UID: \"9fd63e72-276a-43fe-8927-da5aba5b7a98\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjns4" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595438 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c7ff5e6-7b88-45f4-ab1c-fb8b7a40dc62-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-q7njt\" (UID: \"2c7ff5e6-7b88-45f4-ab1c-fb8b7a40dc62\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-q7njt" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595460 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/67274805-ff68-4381-b1b6-9a6fbf85aca5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gxzkf\" (UID: \"67274805-ff68-4381-b1b6-9a6fbf85aca5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gxzkf" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595530 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/005cd89d-7e4e-4bed-aa11-e4d6f871710b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dcxsj\" (UID: \"005cd89d-7e4e-4bed-aa11-e4d6f871710b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcxsj" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595548 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9fd63e72-276a-43fe-8927-da5aba5b7a98-proxy-tls\") pod \"machine-config-controller-84d6567774-mjns4\" (UID: \"9fd63e72-276a-43fe-8927-da5aba5b7a98\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjns4" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595579 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/abc28b46-2a9f-4141-8e65-a9c956e0f261-config-volume\") pod \"collect-profiles-29330940-w7fp5\" (UID: \"abc28b46-2a9f-4141-8e65-a9c956e0f261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330940-w7fp5" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595635 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26c685e0-8c5d-4fdc-a5f4-8b746d285813-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rwphq\" (UID: \"26c685e0-8c5d-4fdc-a5f4-8b746d285813\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rwphq" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595654 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ae45842-477f-4cc6-9ff7-6c38b866e8f9-metrics-certs\") pod \"router-default-5444994796-hcltk\" (UID: \"1ae45842-477f-4cc6-9ff7-6c38b866e8f9\") " pod="openshift-ingress/router-default-5444994796-hcltk" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595680 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/67274805-ff68-4381-b1b6-9a6fbf85aca5-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gxzkf\" (UID: \"67274805-ff68-4381-b1b6-9a6fbf85aca5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gxzkf" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595727 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/837aa149-aced-4911-bdf3-c25e502dc542-serving-cert\") pod \"service-ca-operator-777779d784-67xwd\" (UID: \"837aa149-aced-4911-bdf3-c25e502dc542\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-67xwd" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595765 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdh46\" (UniqueName: \"kubernetes.io/projected/2c7ff5e6-7b88-45f4-ab1c-fb8b7a40dc62-kube-api-access-tdh46\") pod \"kube-storage-version-migrator-operator-b67b599dd-q7njt\" (UID: \"2c7ff5e6-7b88-45f4-ab1c-fb8b7a40dc62\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-q7njt" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595793 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/26c685e0-8c5d-4fdc-a5f4-8b746d285813-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rwphq\" (UID: \"26c685e0-8c5d-4fdc-a5f4-8b746d285813\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rwphq" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595812 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ae45842-477f-4cc6-9ff7-6c38b866e8f9-service-ca-bundle\") pod \"router-default-5444994796-hcltk\" (UID: \"1ae45842-477f-4cc6-9ff7-6c38b866e8f9\") " pod="openshift-ingress/router-default-5444994796-hcltk" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595831 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d181d51e-2df2-4025-b7aa-282418e6c9da-proxy-tls\") pod 
\"machine-config-operator-74547568cd-6s884\" (UID: \"d181d51e-2df2-4025-b7aa-282418e6c9da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6s884" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595866 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1746a4d0-d93b-40e4-bd79-a14425d1a9cc-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4g7gf\" (UID: \"1746a4d0-d93b-40e4-bd79-a14425d1a9cc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4g7gf" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.595981 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ee9a95e-d102-45ea-a77b-711de5bd03a9-config-volume\") pod \"dns-default-dvzgt\" (UID: \"4ee9a95e-d102-45ea-a77b-711de5bd03a9\") " pod="openshift-dns/dns-default-dvzgt" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.596017 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d181d51e-2df2-4025-b7aa-282418e6c9da-images\") pod \"machine-config-operator-74547568cd-6s884\" (UID: \"d181d51e-2df2-4025-b7aa-282418e6c9da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6s884" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.596036 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g55m\" (UniqueName: \"kubernetes.io/projected/4ee9a95e-d102-45ea-a77b-711de5bd03a9-kube-api-access-4g55m\") pod \"dns-default-dvzgt\" (UID: \"4ee9a95e-d102-45ea-a77b-711de5bd03a9\") " pod="openshift-dns/dns-default-dvzgt" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.596056 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/dd5794df-cde0-4881-921f-9ba7006d4281-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-6zl9n\" (UID: \"dd5794df-cde0-4881-921f-9ba7006d4281\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6zl9n" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.596095 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ee9a95e-d102-45ea-a77b-711de5bd03a9-metrics-tls\") pod \"dns-default-dvzgt\" (UID: \"4ee9a95e-d102-45ea-a77b-711de5bd03a9\") " pod="openshift-dns/dns-default-dvzgt" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.596129 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1ae45842-477f-4cc6-9ff7-6c38b866e8f9-stats-auth\") pod \"router-default-5444994796-hcltk\" (UID: \"1ae45842-477f-4cc6-9ff7-6c38b866e8f9\") " pod="openshift-ingress/router-default-5444994796-hcltk" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.596147 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/005cd89d-7e4e-4bed-aa11-e4d6f871710b-trusted-ca\") pod \"ingress-operator-5b745b69d9-dcxsj\" (UID: \"005cd89d-7e4e-4bed-aa11-e4d6f871710b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcxsj" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.596176 4681 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsljs\" (UniqueName: \"kubernetes.io/projected/abc28b46-2a9f-4141-8e65-a9c956e0f261-kube-api-access-lsljs\") pod \"collect-profiles-29330940-w7fp5\" (UID: \"abc28b46-2a9f-4141-8e65-a9c956e0f261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330940-w7fp5" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.596202 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/abc28b46-2a9f-4141-8e65-a9c956e0f261-secret-volume\") pod \"collect-profiles-29330940-w7fp5\" (UID: \"abc28b46-2a9f-4141-8e65-a9c956e0f261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330940-w7fp5" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.596221 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1746a4d0-d93b-40e4-bd79-a14425d1a9cc-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4g7gf\" (UID: \"1746a4d0-d93b-40e4-bd79-a14425d1a9cc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4g7gf" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.596376 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1746a4d0-d93b-40e4-bd79-a14425d1a9cc-config\") pod \"kube-controller-manager-operator-78b949d7b-4g7gf\" (UID: \"1746a4d0-d93b-40e4-bd79-a14425d1a9cc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4g7gf" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.600401 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-q7njt"] Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.600447 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ck2fw"] Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.600469 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dbxkf"] Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.604501 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d181d51e-2df2-4025-b7aa-282418e6c9da-auth-proxy-config\") pod \"machine-config-operator-74547568cd-6s884\" (UID: \"d181d51e-2df2-4025-b7aa-282418e6c9da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6s884" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.608693 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9fd63e72-276a-43fe-8927-da5aba5b7a98-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mjns4\" (UID: \"9fd63e72-276a-43fe-8927-da5aba5b7a98\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjns4" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.614911 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/468bcce5-4e0a-45dd-aeda-fe1a8fd0fea2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6wcpl\" (UID: \"468bcce5-4e0a-45dd-aeda-fe1a8fd0fea2\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6wcpl" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.615258 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d712e0bd-952b-4cba-8d92-2c6e72f6b867-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6jzt8\" (UID: \"d712e0bd-952b-4cba-8d92-2c6e72f6b867\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.619807 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-serving-cert\") pod \"controller-manager-879f6c89f-fst86\" (UID: \"0d8e31b0-652c-44ff-99b0-04ae7d329f6f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fst86" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.620283 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.621236 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dff981f7-635e-4b45-bf64-fbb57407582b-service-ca\") pod \"console-f9d7485db-nds8d\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " pod="openshift-console/console-f9d7485db-nds8d" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.623058 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.623459 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb20182c-f315-41b3-94e2-256dac142821-config\") pod \"machine-approver-56656f9798-6gnnm\" (UID: \"cb20182c-f315-41b3-94e2-256dac142821\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6gnnm" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.623721 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba59400b-2ce1-489d-a70d-747f23b176c6-config\") pod \"machine-api-operator-5694c8668f-tg8wr\" (UID: \"ba59400b-2ce1-489d-a70d-747f23b176c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tg8wr" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.624283 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-client-ca\") pod \"controller-manager-879f6c89f-fst86\" (UID: \"0d8e31b0-652c-44ff-99b0-04ae7d329f6f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fst86" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.625106 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dff981f7-635e-4b45-bf64-fbb57407582b-console-oauth-config\") pod \"console-f9d7485db-nds8d\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " pod="openshift-console/console-f9d7485db-nds8d" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.625353 4681 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.625450 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.625704 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77548ba9-d52a-4585-984e-e08c45a58aec-serving-cert\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.626061 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/77548ba9-d52a-4585-984e-e08c45a58aec-encryption-config\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.626360 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d712e0bd-952b-4cba-8d92-2c6e72f6b867-serving-cert\") pod \"apiserver-7bbb656c7d-6jzt8\" (UID: \"d712e0bd-952b-4cba-8d92-2c6e72f6b867\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.626613 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.627725 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a96ffd28-b774-40e1-ad52-e6fa63483f1d-serving-cert\") pod \"openshift-config-operator-7777fb866f-hgrnc\" (UID: \"a96ffd28-b774-40e1-ad52-e6fa63483f1d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgrnc" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.628058 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9b8f6ae-aa50-4e7a-a59e-359b18fada73-client-ca\") pod \"route-controller-manager-6576b87f9c-26ntm\" (UID: \"b9b8f6ae-aa50-4e7a-a59e-359b18fada73\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm" Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.628527 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fst86"] Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.629902 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ba59400b-2ce1-489d-a70d-747f23b176c6-images\") pod \"machine-api-operator-5694c8668f-tg8wr\" (UID: \"ba59400b-2ce1-489d-a70d-747f23b176c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tg8wr" Oct 07 
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.630365 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.630744 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/77548ba9-d52a-4585-984e-e08c45a58aec-etcd-client\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.632437 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba59400b-2ce1-489d-a70d-747f23b176c6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tg8wr\" (UID: \"ba59400b-2ce1-489d-a70d-747f23b176c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tg8wr"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.632819 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c-etcd-client\") pod \"etcd-operator-b45778765-xjzz7\" (UID: \"f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xjzz7"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.633371 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/67274805-ff68-4381-b1b6-9a6fbf85aca5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gxzkf\" (UID: \"67274805-ff68-4381-b1b6-9a6fbf85aca5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gxzkf"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.633503 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.633685 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.633755 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.634557 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-csgxx"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.634855 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.637241 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vr5kp"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.640028 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ns9f4"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.640050 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dcxsj"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.640060 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nds8d"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.641511 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.644340 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4g7gf"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.644499 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9m58d"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.651110 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rgk2c"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.654792 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.656039 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rwphq"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.656908 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-j8hqs"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.659811 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5n46l"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.660750 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bqww9"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.660862 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5n46l"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.661961 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6zl9n"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.663817 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lq6nd"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.665452 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5n46l"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.667209 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gxzkf"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.667444 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1746a4d0-d93b-40e4-bd79-a14425d1a9cc-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4g7gf\" (UID: \"1746a4d0-d93b-40e4-bd79-a14425d1a9cc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4g7gf"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.668261 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-w6vtn"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.670287 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330940-w7fp5"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.672010 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-6s884"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.673330 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.674282 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7vd82"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.675037 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lwhwn"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.676429 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tw9ww"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.677909 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1746a4d0-d93b-40e4-bd79-a14425d1a9cc-config\") pod \"kube-controller-manager-operator-78b949d7b-4g7gf\" (UID: \"1746a4d0-d93b-40e4-bd79-a14425d1a9cc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4g7gf"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.678449 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xjzz7"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.680154 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mjns4"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.682958 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hq69t"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.684563 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-m8gtz"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.686105 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-w529g"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.686196 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-m8gtz"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.688433 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dvzgt"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.688602 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-67xwd"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.690461 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x8lr7"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.692379 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-hxqsd"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.692922 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hxqsd"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.693402 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-m8gtz"]
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.694564 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.698052 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2xkd\" (UniqueName: \"kubernetes.io/projected/9fd63e72-276a-43fe-8927-da5aba5b7a98-kube-api-access-r2xkd\") pod \"machine-config-controller-84d6567774-mjns4\" (UID: \"9fd63e72-276a-43fe-8927-da5aba5b7a98\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjns4"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.698086 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brpvt\" (UniqueName: \"kubernetes.io/projected/dd5794df-cde0-4881-921f-9ba7006d4281-kube-api-access-brpvt\") pod \"control-plane-machine-set-operator-78cbb6b69f-6zl9n\" (UID: \"dd5794df-cde0-4881-921f-9ba7006d4281\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6zl9n"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.698248 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxhq9\" (UniqueName: \"kubernetes.io/projected/005cd89d-7e4e-4bed-aa11-e4d6f871710b-kube-api-access-nxhq9\") pod \"ingress-operator-5b745b69d9-dcxsj\" (UID: \"005cd89d-7e4e-4bed-aa11-e4d6f871710b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcxsj"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.698281 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60fcc2ef-a564-4e1e-947d-1dd672c4ced7-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-ns9f4\" (UID: \"60fcc2ef-a564-4e1e-947d-1dd672c4ced7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ns9f4"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.713618 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.733777 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.744357 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2fb24a65-5ed4-42a8-9e29-c8bcd1e0de14-metrics-tls\") pod \"dns-operator-744455d44c-9m58d\" (UID: \"2fb24a65-5ed4-42a8-9e29-c8bcd1e0de14\") " pod="openshift-dns-operator/dns-operator-744455d44c-9m58d"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.754140 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.773538 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.786640 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/67274805-ff68-4381-b1b6-9a6fbf85aca5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gxzkf\" (UID: \"67274805-ff68-4381-b1b6-9a6fbf85aca5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gxzkf"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.793436 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.834143 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.855133 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.873280 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.877027 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26c685e0-8c5d-4fdc-a5f4-8b746d285813-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rwphq\" (UID: \"26c685e0-8c5d-4fdc-a5f4-8b746d285813\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rwphq"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.894087 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.914224 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.921835 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26c685e0-8c5d-4fdc-a5f4-8b746d285813-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rwphq\" (UID: \"26c685e0-8c5d-4fdc-a5f4-8b746d285813\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rwphq"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.934143 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.953438 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.959161 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1ae45842-477f-4cc6-9ff7-6c38b866e8f9-default-certificate\") pod \"router-default-5444994796-hcltk\" (UID: \"1ae45842-477f-4cc6-9ff7-6c38b866e8f9\") " pod="openshift-ingress/router-default-5444994796-hcltk"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.974711 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.987418 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1ae45842-477f-4cc6-9ff7-6c38b866e8f9-stats-auth\") pod \"router-default-5444994796-hcltk\" (UID: \"1ae45842-477f-4cc6-9ff7-6c38b866e8f9\") " pod="openshift-ingress/router-default-5444994796-hcltk"
Oct 07 17:05:36 crc kubenswrapper[4681]: I1007 17:05:36.996260 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.005442 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ae45842-477f-4cc6-9ff7-6c38b866e8f9-metrics-certs\") pod \"router-default-5444994796-hcltk\" (UID: \"1ae45842-477f-4cc6-9ff7-6c38b866e8f9\") " pod="openshift-ingress/router-default-5444994796-hcltk"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.014413 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.022410 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ae45842-477f-4cc6-9ff7-6c38b866e8f9-service-ca-bundle\") pod \"router-default-5444994796-hcltk\" (UID: \"1ae45842-477f-4cc6-9ff7-6c38b866e8f9\") " pod="openshift-ingress/router-default-5444994796-hcltk"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.034063 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.054296 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.074437 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.094112 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.113382 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.134720 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.147962 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/005cd89d-7e4e-4bed-aa11-e4d6f871710b-metrics-tls\") pod \"ingress-operator-5b745b69d9-dcxsj\" (UID: \"005cd89d-7e4e-4bed-aa11-e4d6f871710b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcxsj"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.154150 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.176092 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.201799 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.203938 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/005cd89d-7e4e-4bed-aa11-e4d6f871710b-trusted-ca\") pod \"ingress-operator-5b745b69d9-dcxsj\" (UID: \"005cd89d-7e4e-4bed-aa11-e4d6f871710b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcxsj"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.213928 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.234490 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.254086 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.274489 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.293437 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.300405 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c7ff5e6-7b88-45f4-ab1c-fb8b7a40dc62-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-q7njt\" (UID: \"2c7ff5e6-7b88-45f4-ab1c-fb8b7a40dc62\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-q7njt"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.314252 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.334127 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.353782 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.373800 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.388604 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/abc28b46-2a9f-4141-8e65-a9c956e0f261-secret-volume\") pod \"collect-profiles-29330940-w7fp5\" (UID: \"abc28b46-2a9f-4141-8e65-a9c956e0f261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330940-w7fp5"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.394117 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.403636 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60fcc2ef-a564-4e1e-947d-1dd672c4ced7-config\") pod \"kube-apiserver-operator-766d6c64bb-ns9f4\" (UID: \"60fcc2ef-a564-4e1e-947d-1dd672c4ced7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ns9f4"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.414166 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.434120 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.453468 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.467575 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/dd5794df-cde0-4881-921f-9ba7006d4281-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-6zl9n\" (UID: \"dd5794df-cde0-4881-921f-9ba7006d4281\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6zl9n"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.472390 4681 request.go:700] Waited for 1.010871218s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.473799 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.493638 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.514290 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.527781 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60fcc2ef-a564-4e1e-947d-1dd672c4ced7-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-ns9f4\" (UID: \"60fcc2ef-a564-4e1e-947d-1dd672c4ced7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ns9f4"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.535298 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.545348 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c7ff5e6-7b88-45f4-ab1c-fb8b7a40dc62-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-q7njt\" (UID: \"2c7ff5e6-7b88-45f4-ab1c-fb8b7a40dc62\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-q7njt"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.554054 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.574394 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.593688 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Oct 07 17:05:37 crc kubenswrapper[4681]: E1007 17:05:37.602270 4681 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition
Oct 07 17:05:37 crc kubenswrapper[4681]: E1007 17:05:37.602321 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/837aa149-aced-4911-bdf3-c25e502dc542-config podName:837aa149-aced-4911-bdf3-c25e502dc542 nodeName:}" failed. No retries permitted until 2025-10-07 17:05:38.10230714 +0000 UTC m=+141.749718685 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/837aa149-aced-4911-bdf3-c25e502dc542-config") pod "service-ca-operator-777779d784-67xwd" (UID: "837aa149-aced-4911-bdf3-c25e502dc542") : failed to sync configmap cache: timed out waiting for the condition
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.606229 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d181d51e-2df2-4025-b7aa-282418e6c9da-proxy-tls\") pod \"machine-config-operator-74547568cd-6s884\" (UID: \"d181d51e-2df2-4025-b7aa-282418e6c9da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6s884"
Oct 07 17:05:37 crc kubenswrapper[4681]: E1007 17:05:37.610037 4681 secret.go:188] Couldn't get secret openshift-machine-config-operator/mcc-proxy-tls: failed to sync secret cache: timed out waiting for the condition
Oct 07 17:05:37 crc kubenswrapper[4681]: E1007 17:05:37.610117 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fd63e72-276a-43fe-8927-da5aba5b7a98-proxy-tls podName:9fd63e72-276a-43fe-8927-da5aba5b7a98 nodeName:}" failed. No retries permitted until 2025-10-07 17:05:38.110097396 +0000 UTC m=+141.757508971 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9fd63e72-276a-43fe-8927-da5aba5b7a98-proxy-tls") pod "machine-config-controller-84d6567774-mjns4" (UID: "9fd63e72-276a-43fe-8927-da5aba5b7a98") : failed to sync secret cache: timed out waiting for the condition
Oct 07 17:05:37 crc kubenswrapper[4681]: E1007 17:05:37.610174 4681 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition
Oct 07 17:05:37 crc kubenswrapper[4681]: E1007 17:05:37.610212 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/abc28b46-2a9f-4141-8e65-a9c956e0f261-config-volume podName:abc28b46-2a9f-4141-8e65-a9c956e0f261 nodeName:}" failed. No retries permitted until 2025-10-07 17:05:38.110200159 +0000 UTC m=+141.757611734 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/abc28b46-2a9f-4141-8e65-a9c956e0f261-config-volume") pod "collect-profiles-29330940-w7fp5" (UID: "abc28b46-2a9f-4141-8e65-a9c956e0f261") : failed to sync configmap cache: timed out waiting for the condition
Oct 07 17:05:37 crc kubenswrapper[4681]: E1007 17:05:37.611136 4681 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition
Oct 07 17:05:37 crc kubenswrapper[4681]: E1007 17:05:37.611185 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/837aa149-aced-4911-bdf3-c25e502dc542-serving-cert podName:837aa149-aced-4911-bdf3-c25e502dc542 nodeName:}" failed. No retries permitted until 2025-10-07 17:05:38.111173718 +0000 UTC m=+141.758585273 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/837aa149-aced-4911-bdf3-c25e502dc542-serving-cert") pod "service-ca-operator-777779d784-67xwd" (UID: "837aa149-aced-4911-bdf3-c25e502dc542") : failed to sync secret cache: timed out waiting for the condition
Oct 07 17:05:37 crc kubenswrapper[4681]: E1007 17:05:37.612314 4681 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition
Oct 07 17:05:37 crc kubenswrapper[4681]: E1007 17:05:37.612359 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d181d51e-2df2-4025-b7aa-282418e6c9da-images podName:d181d51e-2df2-4025-b7aa-282418e6c9da nodeName:}" failed. No retries permitted until 2025-10-07 17:05:38.112347231 +0000 UTC m=+141.759758786 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/d181d51e-2df2-4025-b7aa-282418e6c9da-images") pod "machine-config-operator-74547568cd-6s884" (UID: "d181d51e-2df2-4025-b7aa-282418e6c9da") : failed to sync configmap cache: timed out waiting for the condition
Oct 07 17:05:37 crc kubenswrapper[4681]: E1007 17:05:37.612895 4681 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition
Oct 07 17:05:37 crc kubenswrapper[4681]: E1007 17:05:37.612962 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ee9a95e-d102-45ea-a77b-711de5bd03a9-metrics-tls podName:4ee9a95e-d102-45ea-a77b-711de5bd03a9 nodeName:}" failed. No retries permitted until 2025-10-07 17:05:38.112941398 +0000 UTC m=+141.760352963 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4ee9a95e-d102-45ea-a77b-711de5bd03a9-metrics-tls") pod "dns-default-dvzgt" (UID: "4ee9a95e-d102-45ea-a77b-711de5bd03a9") : failed to sync secret cache: timed out waiting for the condition
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.614128 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Oct 07 17:05:37 crc kubenswrapper[4681]: E1007 17:05:37.619465 4681 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition
Oct 07 17:05:37 crc kubenswrapper[4681]: E1007 17:05:37.619555 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4ee9a95e-d102-45ea-a77b-711de5bd03a9-config-volume podName:4ee9a95e-d102-45ea-a77b-711de5bd03a9 nodeName:}" failed. No retries permitted until 2025-10-07 17:05:38.11951745 +0000 UTC m=+141.766929015 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/4ee9a95e-d102-45ea-a77b-711de5bd03a9-config-volume") pod "dns-default-dvzgt" (UID: "4ee9a95e-d102-45ea-a77b-711de5bd03a9") : failed to sync configmap cache: timed out waiting for the condition
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.634701 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.653830 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.674281 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.693414 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.713448 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.735285 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.753356 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.773481 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.793785 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.817018 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.833019 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.853664 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.874091 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.893551 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.913553 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.939251 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.990368 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Oct 07 17:05:37 crc kubenswrapper[4681]: I1007 17:05:37.993865 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.038422 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lswx9\" (UniqueName: \"kubernetes.io/projected/f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c-kube-api-access-lswx9\") pod \"etcd-operator-b45778765-xjzz7\" (UID: \"f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xjzz7"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.057834 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbp5t\" (UniqueName: \"kubernetes.io/projected/a5b108e0-c0b3-442b-82c7-4ec003e3de22-kube-api-access-fbp5t\") pod \"openshift-controller-manager-operator-756b6f6bc6-ck2fw\" (UID: \"a5b108e0-c0b3-442b-82c7-4ec003e3de22\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ck2fw"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.071861 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx96w\" (UniqueName: \"kubernetes.io/projected/dff981f7-635e-4b45-bf64-fbb57407582b-kube-api-access-bx96w\") pod \"console-f9d7485db-nds8d\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " pod="openshift-console/console-f9d7485db-nds8d"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.093847 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.094572 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsfp7\" (UniqueName: \"kubernetes.io/projected/77548ba9-d52a-4585-984e-e08c45a58aec-kube-api-access-xsfp7\") pod \"apiserver-76f77b778f-rgk2c\" (UID: \"77548ba9-d52a-4585-984e-e08c45a58aec\") " pod="openshift-apiserver/apiserver-76f77b778f-rgk2c"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.120349 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/abc28b46-2a9f-4141-8e65-a9c956e0f261-config-volume\") pod \"collect-profiles-29330940-w7fp5\" (UID: \"abc28b46-2a9f-4141-8e65-a9c956e0f261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330940-w7fp5"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.120425 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/837aa149-aced-4911-bdf3-c25e502dc542-serving-cert\") pod \"service-ca-operator-777779d784-67xwd\" (UID: \"837aa149-aced-4911-bdf3-c25e502dc542\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-67xwd"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.120583 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ee9a95e-d102-45ea-a77b-711de5bd03a9-config-volume\") pod \"dns-default-dvzgt\" (UID: \"4ee9a95e-d102-45ea-a77b-711de5bd03a9\") " pod="openshift-dns/dns-default-dvzgt"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.120609 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d181d51e-2df2-4025-b7aa-282418e6c9da-images\") pod \"machine-config-operator-74547568cd-6s884\" (UID: \"d181d51e-2df2-4025-b7aa-282418e6c9da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6s884"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.121388 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d181d51e-2df2-4025-b7aa-282418e6c9da-images\") pod \"machine-config-operator-74547568cd-6s884\" (UID: \"d181d51e-2df2-4025-b7aa-282418e6c9da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6s884"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.121468 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ee9a95e-d102-45ea-a77b-711de5bd03a9-metrics-tls\") pod \"dns-default-dvzgt\" (UID: \"4ee9a95e-d102-45ea-a77b-711de5bd03a9\") " pod="openshift-dns/dns-default-dvzgt"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.121665 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/abc28b46-2a9f-4141-8e65-a9c956e0f261-config-volume\") pod \"collect-profiles-29330940-w7fp5\" (UID: \"abc28b46-2a9f-4141-8e65-a9c956e0f261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330940-w7fp5"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.121737 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/837aa149-aced-4911-bdf3-c25e502dc542-config\") pod \"service-ca-operator-777779d784-67xwd\" (UID: \"837aa149-aced-4911-bdf3-c25e502dc542\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-67xwd"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.121851 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9fd63e72-276a-43fe-8927-da5aba5b7a98-proxy-tls\") pod \"machine-config-controller-84d6567774-mjns4\" (UID: \"9fd63e72-276a-43fe-8927-da5aba5b7a98\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjns4"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.122420 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ee9a95e-d102-45ea-a77b-711de5bd03a9-config-volume\") pod \"dns-default-dvzgt\" (UID: \"4ee9a95e-d102-45ea-a77b-711de5bd03a9\") " pod="openshift-dns/dns-default-dvzgt"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.123156 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/837aa149-aced-4911-bdf3-c25e502dc542-config\") pod \"service-ca-operator-777779d784-67xwd\" (UID: \"837aa149-aced-4911-bdf3-c25e502dc542\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-67xwd"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.123568 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/837aa149-aced-4911-bdf3-c25e502dc542-serving-cert\") pod \"service-ca-operator-777779d784-67xwd\" (UID: \"837aa149-aced-4911-bdf3-c25e502dc542\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-67xwd"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.125038 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ee9a95e-d102-45ea-a77b-711de5bd03a9-metrics-tls\") pod \"dns-default-dvzgt\" (UID: \"4ee9a95e-d102-45ea-a77b-711de5bd03a9\") " pod="openshift-dns/dns-default-dvzgt"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.125117 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9fd63e72-276a-43fe-8927-da5aba5b7a98-proxy-tls\") pod \"machine-config-controller-84d6567774-mjns4\" (UID: \"9fd63e72-276a-43fe-8927-da5aba5b7a98\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjns4"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.129943 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4lvv\" (UniqueName: \"kubernetes.io/projected/cb20182c-f315-41b3-94e2-256dac142821-kube-api-access-k4lvv\") pod \"machine-approver-56656f9798-6gnnm\" (UID: \"cb20182c-f315-41b3-94e2-256dac142821\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6gnnm"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.149832 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqr6z\" (UniqueName: \"kubernetes.io/projected/d712e0bd-952b-4cba-8d92-2c6e72f6b867-kube-api-access-jqr6z\") pod \"apiserver-7bbb656c7d-6jzt8\" (UID: \"d712e0bd-952b-4cba-8d92-2c6e72f6b867\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.181300 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.183533 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pzks\" (UniqueName: \"kubernetes.io/projected/ba59400b-2ce1-489d-a70d-747f23b176c6-kube-api-access-4pzks\") pod \"machine-api-operator-5694c8668f-tg8wr\" (UID: \"ba59400b-2ce1-489d-a70d-747f23b176c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tg8wr"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.186009 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ck2fw"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.192053 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xjzz7"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.193840 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.200184 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nds8d"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.214310 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.238310 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.240798 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-rgk2c"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.262606 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.277675 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.288299 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6gnnm"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.299639 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-tg8wr"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.320616 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfg79\" (UniqueName: \"kubernetes.io/projected/3017a611-cb0d-4f79-b6f8-2634dc026e2e-kube-api-access-tfg79\") pod \"cluster-samples-operator-665b6dd947-dbxkf\" (UID: \"3017a611-cb0d-4f79-b6f8-2634dc026e2e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dbxkf"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.332167 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxdjv\" (UniqueName: \"kubernetes.io/projected/c2c64f34-b460-412c-b82e-2dbc6c93444e-kube-api-access-mxdjv\") pod \"oauth-openshift-558db77b4-tw9ww\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.354057 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmbvx\" (UniqueName: \"kubernetes.io/projected/468bcce5-4e0a-45dd-aeda-fe1a8fd0fea2-kube-api-access-mmbvx\") pod \"openshift-apiserver-operator-796bbdcf4f-6wcpl\" (UID: \"468bcce5-4e0a-45dd-aeda-fe1a8fd0fea2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6wcpl"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.371667 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcpx2\" (UniqueName: \"kubernetes.io/projected/c669d56a-7d2e-4161-ac70-29d72a747038-kube-api-access-fcpx2\") pod \"downloads-7954f5f757-j8hqs\" (UID: \"c669d56a-7d2e-4161-ac70-29d72a747038\") " pod="openshift-console/downloads-7954f5f757-j8hqs"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.389600 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffpz2\" (UniqueName: \"kubernetes.io/projected/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-kube-api-access-ffpz2\") pod \"controller-manager-879f6c89f-fst86\" (UID: \"0d8e31b0-652c-44ff-99b0-04ae7d329f6f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fst86"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.413422 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvhkj\" (UniqueName: \"kubernetes.io/projected/ca03f8ab-38f2-4aea-9b61-54526e3c5015-kube-api-access-qvhkj\") pod \"authentication-operator-69f744f599-dcg8h\" (UID: \"ca03f8ab-38f2-4aea-9b61-54526e3c5015\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dcg8h"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.433747 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcqlw\" (UniqueName: \"kubernetes.io/projected/2fdbba2d-2b9f-47f3-a618-f1284f5bce5b-kube-api-access-dcqlw\") pod \"console-operator-58897d9998-csgxx\" (UID: \"2fdbba2d-2b9f-47f3-a618-f1284f5bce5b\") " pod="openshift-console-operator/console-operator-58897d9998-csgxx"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.445150 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.453438 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chrn7\" (UniqueName: \"kubernetes.io/projected/d181d51e-2df2-4025-b7aa-282418e6c9da-kube-api-access-chrn7\") pod \"machine-config-operator-74547568cd-6s884\" (UID: \"d181d51e-2df2-4025-b7aa-282418e6c9da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6s884"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.456023 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-j8hqs"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.465287 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6wcpl"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.469306 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-csgxx"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.472543 4681 request.go:700] Waited for 1.871254029s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns-operator/serviceaccounts/dns-operator/token
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.477575 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glwm2\" (UniqueName: \"kubernetes.io/projected/837aa149-aced-4911-bdf3-c25e502dc542-kube-api-access-glwm2\") pod \"service-ca-operator-777779d784-67xwd\" (UID: \"837aa149-aced-4911-bdf3-c25e502dc542\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-67xwd"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.478727 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ck2fw"]
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.480196 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dbxkf"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.491478 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.493987 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szlk7\" (UniqueName: \"kubernetes.io/projected/2fb24a65-5ed4-42a8-9e29-c8bcd1e0de14-kube-api-access-szlk7\") pod \"dns-operator-744455d44c-9m58d\" (UID: \"2fb24a65-5ed4-42a8-9e29-c8bcd1e0de14\") " pod="openshift-dns-operator/dns-operator-744455d44c-9m58d"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.516735 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tbbf\" (UniqueName: \"kubernetes.io/projected/1af973a2-4ae5-4a2e-9d36-941f8689054b-kube-api-access-8tbbf\") pod \"migrator-59844c95c7-lq6nd\" (UID: \"1af973a2-4ae5-4a2e-9d36-941f8689054b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lq6nd"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.519093 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dcg8h"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.524191 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9m58d"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.540975 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5g6r\" (UniqueName: \"kubernetes.io/projected/67274805-ff68-4381-b1b6-9a6fbf85aca5-kube-api-access-x5g6r\") pod \"cluster-image-registry-operator-dc59b4c8b-gxzkf\" (UID: \"67274805-ff68-4381-b1b6-9a6fbf85aca5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gxzkf"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.558745 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xjzz7"]
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.560729 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbbbg\" (UniqueName: \"kubernetes.io/projected/1ae45842-477f-4cc6-9ff7-6c38b866e8f9-kube-api-access-qbbbg\") pod \"router-default-5444994796-hcltk\" (UID: \"1ae45842-477f-4cc6-9ff7-6c38b866e8f9\") " pod="openshift-ingress/router-default-5444994796-hcltk"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.573757 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lq6nd"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.576176 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fst86"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.588629 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/005cd89d-7e4e-4bed-aa11-e4d6f871710b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dcxsj\" (UID: \"005cd89d-7e4e-4bed-aa11-e4d6f871710b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcxsj"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.597961 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/67274805-ff68-4381-b1b6-9a6fbf85aca5-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gxzkf\" (UID: \"67274805-ff68-4381-b1b6-9a6fbf85aca5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gxzkf"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.599542 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tg8wr"]
Oct 07 17:05:38 crc kubenswrapper[4681]: W1007 17:05:38.608690 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8ad8df9_8ba5_4f07_ae44_c2e1addfff5c.slice/crio-746dadd9ce0539e5e838f261da7586cab15423ad88291aa89a16d53ea9136e47 WatchSource:0}: Error finding container 746dadd9ce0539e5e838f261da7586cab15423ad88291aa89a16d53ea9136e47: Status 404 returned error can't find the container with id 746dadd9ce0539e5e838f261da7586cab15423ad88291aa89a16d53ea9136e47
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.607872 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6s884"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.613388 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdh46\" (UniqueName: \"kubernetes.io/projected/2c7ff5e6-7b88-45f4-ab1c-fb8b7a40dc62-kube-api-access-tdh46\") pod \"kube-storage-version-migrator-operator-b67b599dd-q7njt\" (UID: \"2c7ff5e6-7b88-45f4-ab1c-fb8b7a40dc62\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-q7njt"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.638420 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/26c685e0-8c5d-4fdc-a5f4-8b746d285813-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-rwphq\" (UID: \"26c685e0-8c5d-4fdc-a5f4-8b746d285813\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rwphq"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.650406 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-67xwd"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.678610 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g55m\" (UniqueName: \"kubernetes.io/projected/4ee9a95e-d102-45ea-a77b-711de5bd03a9-kube-api-access-4g55m\") pod \"dns-default-dvzgt\" (UID: \"4ee9a95e-d102-45ea-a77b-711de5bd03a9\") " pod="openshift-dns/dns-default-dvzgt"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.678989 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1746a4d0-d93b-40e4-bd79-a14425d1a9cc-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4g7gf\" (UID: \"1746a4d0-d93b-40e4-bd79-a14425d1a9cc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4g7gf"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.707846 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsljs\" (UniqueName: \"kubernetes.io/projected/abc28b46-2a9f-4141-8e65-a9c956e0f261-kube-api-access-lsljs\") pod \"collect-profiles-29330940-w7fp5\" (UID: \"abc28b46-2a9f-4141-8e65-a9c956e0f261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330940-w7fp5"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.721711 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tg8wr" event={"ID":"ba59400b-2ce1-489d-a70d-747f23b176c6","Type":"ContainerStarted","Data":"9d5d195aba32a0da2484f898676fd119d7efae8ace64c60ef526d6273132a79c"}
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.722519 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xjzz7" event={"ID":"f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c","Type":"ContainerStarted","Data":"746dadd9ce0539e5e838f261da7586cab15423ad88291aa89a16d53ea9136e47"}
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.723282 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ck2fw" event={"ID":"a5b108e0-c0b3-442b-82c7-4ec003e3de22","Type":"ContainerStarted","Data":"4df1559f8cf52acc84a933f1a08e3cfa6a818c0e0de4ee3aa2a4057dd45a1983"}
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.724025 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6gnnm" event={"ID":"cb20182c-f315-41b3-94e2-256dac142821","Type":"ContainerStarted","Data":"a599bd4a390ad08d2cd5fa4f25b33f16ac228cceb6f16385f6dcefd15492988b"}
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.724787 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9sj6\" (UniqueName: \"kubernetes.io/projected/b9b8f6ae-aa50-4e7a-a59e-359b18fada73-kube-api-access-b9sj6\") pod \"route-controller-manager-6576b87f9c-26ntm\" (UID: \"b9b8f6ae-aa50-4e7a-a59e-359b18fada73\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.732408 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvklw\" (UniqueName: \"kubernetes.io/projected/a96ffd28-b774-40e1-ad52-e6fa63483f1d-kube-api-access-kvklw\") pod \"openshift-config-operator-7777fb866f-hgrnc\" (UID: \"a96ffd28-b774-40e1-ad52-e6fa63483f1d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgrnc"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.736061 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.740587 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nds8d"]
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.753847 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.779291 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.787622 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rgk2c"]
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.804854 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.807237 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.814422 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.815691 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4g7gf"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.820132 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8"]
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.841439 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gxzkf"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.844406 4681 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.844677 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rwphq"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.850729 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-hcltk"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.857235 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.865511 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgrnc"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.874001 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.877986 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6wcpl"]
Oct 07 17:05:38 crc kubenswrapper[4681]: W1007 17:05:38.877984 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd712e0bd_952b_4cba_8d92_2c6e72f6b867.slice/crio-ddc5b1c34abf7e3446594d2ce9d889ee3cc619195102d04228d270f83102ee41 WatchSource:0}: Error finding container ddc5b1c34abf7e3446594d2ce9d889ee3cc619195102d04228d270f83102ee41: Status 404 returned error can't find the container with id ddc5b1c34abf7e3446594d2ce9d889ee3cc619195102d04228d270f83102ee41
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.892787 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-q7njt"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.893184 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.911835 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dbxkf"]
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.914933 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.933537 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330940-w7fp5"
Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.938120 4681 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-dns/dns-default-dvzgt" Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.958029 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2xkd\" (UniqueName: \"kubernetes.io/projected/9fd63e72-276a-43fe-8927-da5aba5b7a98-kube-api-access-r2xkd\") pod \"machine-config-controller-84d6567774-mjns4\" (UID: \"9fd63e72-276a-43fe-8927-da5aba5b7a98\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjns4" Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.968775 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brpvt\" (UniqueName: \"kubernetes.io/projected/dd5794df-cde0-4881-921f-9ba7006d4281-kube-api-access-brpvt\") pod \"control-plane-machine-set-operator-78cbb6b69f-6zl9n\" (UID: \"dd5794df-cde0-4881-921f-9ba7006d4281\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6zl9n" Oct 07 17:05:38 crc kubenswrapper[4681]: I1007 17:05:38.989198 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxhq9\" (UniqueName: \"kubernetes.io/projected/005cd89d-7e4e-4bed-aa11-e4d6f871710b-kube-api-access-nxhq9\") pod \"ingress-operator-5b745b69d9-dcxsj\" (UID: \"005cd89d-7e4e-4bed-aa11-e4d6f871710b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcxsj" Oct 07 17:05:39 crc kubenswrapper[4681]: W1007 17:05:39.009987 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod468bcce5_4e0a_45dd_aeda_fe1a8fd0fea2.slice/crio-e08f00c85967f32339d162edade960032afced3aa011728f83e8cd2f4f8ddb18 WatchSource:0}: Error finding container e08f00c85967f32339d162edade960032afced3aa011728f83e8cd2f4f8ddb18: Status 404 returned error can't find the container with id e08f00c85967f32339d162edade960032afced3aa011728f83e8cd2f4f8ddb18 Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.019391 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60fcc2ef-a564-4e1e-947d-1dd672c4ced7-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-ns9f4\" (UID: \"60fcc2ef-a564-4e1e-947d-1dd672c4ced7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ns9f4" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.047288 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8cd7e753-b687-49de-83ec-55f1fc315007-signing-cabundle\") pod \"service-ca-9c57cc56f-w529g\" (UID: \"8cd7e753-b687-49de-83ec-55f1fc315007\") " pod="openshift-service-ca/service-ca-9c57cc56f-w529g" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.047482 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8cd7e753-b687-49de-83ec-55f1fc315007-signing-key\") pod \"service-ca-9c57cc56f-w529g\" (UID: \"8cd7e753-b687-49de-83ec-55f1fc315007\") " pod="openshift-service-ca/service-ca-9c57cc56f-w529g" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.047601 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cmzk\" (UniqueName: \"kubernetes.io/projected/77d83873-a7b2-42d5-a94e-d7bfc4784cba-kube-api-access-9cmzk\") pod \"catalog-operator-68c6474976-bqww9\" (UID: 
\"77d83873-a7b2-42d5-a94e-d7bfc4784cba\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bqww9" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.047737 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/66e9eba2-1514-42a7-b14b-802c380cc3b3-bound-sa-token\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.047819 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66e9eba2-1514-42a7-b14b-802c380cc3b3-trusted-ca\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.047945 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/66e9eba2-1514-42a7-b14b-802c380cc3b3-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.048178 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/66e9eba2-1514-42a7-b14b-802c380cc3b3-registry-tls\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.048257 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/77d83873-a7b2-42d5-a94e-d7bfc4784cba-srv-cert\") pod \"catalog-operator-68c6474976-bqww9\" (UID: \"77d83873-a7b2-42d5-a94e-d7bfc4784cba\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bqww9" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.048343 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78sn8\" (UniqueName: \"kubernetes.io/projected/8cd7e753-b687-49de-83ec-55f1fc315007-kube-api-access-78sn8\") pod \"service-ca-9c57cc56f-w529g\" (UID: \"8cd7e753-b687-49de-83ec-55f1fc315007\") " pod="openshift-service-ca/service-ca-9c57cc56f-w529g" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.048429 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/66e9eba2-1514-42a7-b14b-802c380cc3b3-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.048563 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/77d83873-a7b2-42d5-a94e-d7bfc4784cba-profile-collector-cert\") pod \"catalog-operator-68c6474976-bqww9\" (UID: \"77d83873-a7b2-42d5-a94e-d7bfc4784cba\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bqww9" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.048640 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztmsb\" (UniqueName: \"kubernetes.io/projected/d38a6e43-b412-4c01-bf04-1036f9d6942c-kube-api-access-ztmsb\") pod \"multus-admission-controller-857f4d67dd-w6vtn\" (UID: \"d38a6e43-b412-4c01-bf04-1036f9d6942c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-w6vtn" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.048704 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnmtf\" (UniqueName: \"kubernetes.io/projected/66e9eba2-1514-42a7-b14b-802c380cc3b3-kube-api-access-qnmtf\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.048787 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/66e9eba2-1514-42a7-b14b-802c380cc3b3-registry-certificates\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.048932 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d38a6e43-b412-4c01-bf04-1036f9d6942c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-w6vtn\" (UID: \"d38a6e43-b412-4c01-bf04-1036f9d6942c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-w6vtn" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.049023 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:39 crc kubenswrapper[4681]: E1007 17:05:39.049599 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:39.549507442 +0000 UTC m=+143.196918987 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.149658 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.149914 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66e9eba2-1514-42a7-b14b-802c380cc3b3-trusted-ca\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.150030 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f926p\" (UniqueName: \"kubernetes.io/projected/0abf116d-d4aa-4bb8-95ab-430c181d5bdf-kube-api-access-f926p\") pod \"packageserver-d55dfcdfc-lwhwn\" (UID: \"0abf116d-d4aa-4bb8-95ab-430c181d5bdf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lwhwn" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.150084 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/66e9eba2-1514-42a7-b14b-802c380cc3b3-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.155608 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66e9eba2-1514-42a7-b14b-802c380cc3b3-trusted-ca\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.156073 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/66e9eba2-1514-42a7-b14b-802c380cc3b3-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.156139 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/66e9eba2-1514-42a7-b14b-802c380cc3b3-registry-tls\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.156162 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/7a79ff15-67a5-43e6-a92d-84c3168db81b-plugins-dir\") pod \"csi-hostpathplugin-m8gtz\" (UID: \"7a79ff15-67a5-43e6-a92d-84c3168db81b\") " pod="hostpath-provisioner/csi-hostpathplugin-m8gtz" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.156226 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b45f921-126b-4677-889b-67da1ba1840e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7vd82\" (UID: \"0b45f921-126b-4677-889b-67da1ba1840e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7vd82" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.156279 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/77d83873-a7b2-42d5-a94e-d7bfc4784cba-srv-cert\") pod \"catalog-operator-68c6474976-bqww9\" (UID: \"77d83873-a7b2-42d5-a94e-d7bfc4784cba\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bqww9" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.156298 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7a79ff15-67a5-43e6-a92d-84c3168db81b-csi-data-dir\") pod \"csi-hostpathplugin-m8gtz\" (UID: \"7a79ff15-67a5-43e6-a92d-84c3168db81b\") " pod="hostpath-provisioner/csi-hostpathplugin-m8gtz" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.156330 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78sn8\" (UniqueName: \"kubernetes.io/projected/8cd7e753-b687-49de-83ec-55f1fc315007-kube-api-access-78sn8\") pod \"service-ca-9c57cc56f-w529g\" (UID: \"8cd7e753-b687-49de-83ec-55f1fc315007\") " pod="openshift-service-ca/service-ca-9c57cc56f-w529g" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.156375 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7a79ff15-67a5-43e6-a92d-84c3168db81b-socket-dir\") pod \"csi-hostpathplugin-m8gtz\" (UID: \"7a79ff15-67a5-43e6-a92d-84c3168db81b\") " pod="hostpath-provisioner/csi-hostpathplugin-m8gtz" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.156398 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/10e7236c-b3fa-4c0c-9d1f-068374e029cb-profile-collector-cert\") pod \"olm-operator-6b444d44fb-x8lr7\" (UID: \"10e7236c-b3fa-4c0c-9d1f-068374e029cb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x8lr7" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.156437 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7a79ff15-67a5-43e6-a92d-84c3168db81b-mountpoint-dir\") pod \"csi-hostpathplugin-m8gtz\" (UID: \"7a79ff15-67a5-43e6-a92d-84c3168db81b\") " pod="hostpath-provisioner/csi-hostpathplugin-m8gtz" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.156465 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/10e7236c-b3fa-4c0c-9d1f-068374e029cb-srv-cert\") pod \"olm-operator-6b444d44fb-x8lr7\" (UID: \"10e7236c-b3fa-4c0c-9d1f-068374e029cb\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x8lr7" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.156528 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/66e9eba2-1514-42a7-b14b-802c380cc3b3-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.156592 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw947\" (UniqueName: \"kubernetes.io/projected/0b45f921-126b-4677-889b-67da1ba1840e-kube-api-access-bw947\") pod \"package-server-manager-789f6589d5-7vd82\" (UID: \"0b45f921-126b-4677-889b-67da1ba1840e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7vd82" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.156637 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/77d83873-a7b2-42d5-a94e-d7bfc4784cba-profile-collector-cert\") pod \"catalog-operator-68c6474976-bqww9\" (UID: \"77d83873-a7b2-42d5-a94e-d7bfc4784cba\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bqww9" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.156666 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztmsb\" (UniqueName: \"kubernetes.io/projected/d38a6e43-b412-4c01-bf04-1036f9d6942c-kube-api-access-ztmsb\") pod \"multus-admission-controller-857f4d67dd-w6vtn\" (UID: \"d38a6e43-b412-4c01-bf04-1036f9d6942c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-w6vtn" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.156693 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnmtf\" (UniqueName: \"kubernetes.io/projected/66e9eba2-1514-42a7-b14b-802c380cc3b3-kube-api-access-qnmtf\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.156756 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njzgk\" (UniqueName: \"kubernetes.io/projected/365d94e6-7b9f-49bb-92db-f5a232406e10-kube-api-access-njzgk\") pod \"ingress-canary-5n46l\" (UID: \"365d94e6-7b9f-49bb-92db-f5a232406e10\") " pod="openshift-ingress-canary/ingress-canary-5n46l" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.156797 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/66e9eba2-1514-42a7-b14b-802c380cc3b3-registry-certificates\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.156826 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61783842-70fb-40b1-bc57-f614ca527168-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hq69t\" (UID: \"61783842-70fb-40b1-bc57-f614ca527168\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-hq69t" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.156842 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fb2b7d26-6891-4f5d-9765-5b8922abd5cd-node-bootstrap-token\") pod \"machine-config-server-hxqsd\" (UID: \"fb2b7d26-6891-4f5d-9765-5b8922abd5cd\") " pod="openshift-machine-config-operator/machine-config-server-hxqsd" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.156858 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0abf116d-d4aa-4bb8-95ab-430c181d5bdf-tmpfs\") pod \"packageserver-d55dfcdfc-lwhwn\" (UID: \"0abf116d-d4aa-4bb8-95ab-430c181d5bdf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lwhwn" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.156917 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0abf116d-d4aa-4bb8-95ab-430c181d5bdf-apiservice-cert\") pod \"packageserver-d55dfcdfc-lwhwn\" (UID: \"0abf116d-d4aa-4bb8-95ab-430c181d5bdf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lwhwn" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.157010 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/fb2b7d26-6891-4f5d-9765-5b8922abd5cd-certs\") pod \"machine-config-server-hxqsd\" (UID: \"fb2b7d26-6891-4f5d-9765-5b8922abd5cd\") " pod="openshift-machine-config-operator/machine-config-server-hxqsd" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.157037 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7a79ff15-67a5-43e6-a92d-84c3168db81b-registration-dir\") pod \"csi-hostpathplugin-m8gtz\" (UID: \"7a79ff15-67a5-43e6-a92d-84c3168db81b\") " pod="hostpath-provisioner/csi-hostpathplugin-m8gtz" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.157085 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/61783842-70fb-40b1-bc57-f614ca527168-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hq69t\" (UID: \"61783842-70fb-40b1-bc57-f614ca527168\") " pod="openshift-marketplace/marketplace-operator-79b997595-hq69t" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.157102 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/365d94e6-7b9f-49bb-92db-f5a232406e10-cert\") pod \"ingress-canary-5n46l\" (UID: \"365d94e6-7b9f-49bb-92db-f5a232406e10\") " pod="openshift-ingress-canary/ingress-canary-5n46l" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.157123 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d38a6e43-b412-4c01-bf04-1036f9d6942c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-w6vtn\" (UID: \"d38a6e43-b412-4c01-bf04-1036f9d6942c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-w6vtn" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.157193 4681 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7qh4\" (UniqueName: \"kubernetes.io/projected/61783842-70fb-40b1-bc57-f614ca527168-kube-api-access-m7qh4\") pod \"marketplace-operator-79b997595-hq69t\" (UID: \"61783842-70fb-40b1-bc57-f614ca527168\") " pod="openshift-marketplace/marketplace-operator-79b997595-hq69t" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.157229 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sns9\" (UniqueName: \"kubernetes.io/projected/10e7236c-b3fa-4c0c-9d1f-068374e029cb-kube-api-access-8sns9\") pod \"olm-operator-6b444d44fb-x8lr7\" (UID: \"10e7236c-b3fa-4c0c-9d1f-068374e029cb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x8lr7" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.157274 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0abf116d-d4aa-4bb8-95ab-430c181d5bdf-webhook-cert\") pod \"packageserver-d55dfcdfc-lwhwn\" (UID: \"0abf116d-d4aa-4bb8-95ab-430c181d5bdf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lwhwn" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.157312 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8cd7e753-b687-49de-83ec-55f1fc315007-signing-cabundle\") pod \"service-ca-9c57cc56f-w529g\" (UID: \"8cd7e753-b687-49de-83ec-55f1fc315007\") " pod="openshift-service-ca/service-ca-9c57cc56f-w529g" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.157328 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hmv7\" (UniqueName: \"kubernetes.io/projected/7a79ff15-67a5-43e6-a92d-84c3168db81b-kube-api-access-4hmv7\") pod \"csi-hostpathplugin-m8gtz\" (UID: \"7a79ff15-67a5-43e6-a92d-84c3168db81b\") " pod="hostpath-provisioner/csi-hostpathplugin-m8gtz" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.157345 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg7sn\" (UniqueName: \"kubernetes.io/projected/fb2b7d26-6891-4f5d-9765-5b8922abd5cd-kube-api-access-zg7sn\") pod \"machine-config-server-hxqsd\" (UID: \"fb2b7d26-6891-4f5d-9765-5b8922abd5cd\") " pod="openshift-machine-config-operator/machine-config-server-hxqsd" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.157360 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8cd7e753-b687-49de-83ec-55f1fc315007-signing-key\") pod \"service-ca-9c57cc56f-w529g\" (UID: \"8cd7e753-b687-49de-83ec-55f1fc315007\") " pod="openshift-service-ca/service-ca-9c57cc56f-w529g" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.157406 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cmzk\" (UniqueName: \"kubernetes.io/projected/77d83873-a7b2-42d5-a94e-d7bfc4784cba-kube-api-access-9cmzk\") pod \"catalog-operator-68c6474976-bqww9\" (UID: \"77d83873-a7b2-42d5-a94e-d7bfc4784cba\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bqww9" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.157458 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/66e9eba2-1514-42a7-b14b-802c380cc3b3-bound-sa-token\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.162848 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/66e9eba2-1514-42a7-b14b-802c380cc3b3-registry-certificates\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.164613 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8cd7e753-b687-49de-83ec-55f1fc315007-signing-cabundle\") pod \"service-ca-9c57cc56f-w529g\" (UID: \"8cd7e753-b687-49de-83ec-55f1fc315007\") " pod="openshift-service-ca/service-ca-9c57cc56f-w529g" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.167231 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcxsj" Oct 07 17:05:39 crc kubenswrapper[4681]: E1007 17:05:39.172500 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:39.67247406 +0000 UTC m=+143.319885605 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.176145 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/66e9eba2-1514-42a7-b14b-802c380cc3b3-registry-tls\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.179937 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/77d83873-a7b2-42d5-a94e-d7bfc4784cba-profile-collector-cert\") pod \"catalog-operator-68c6474976-bqww9\" (UID: \"77d83873-a7b2-42d5-a94e-d7bfc4784cba\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bqww9" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.180172 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ns9f4" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.182417 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/66e9eba2-1514-42a7-b14b-802c380cc3b3-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.184075 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/77d83873-a7b2-42d5-a94e-d7bfc4784cba-srv-cert\") pod \"catalog-operator-68c6474976-bqww9\" (UID: \"77d83873-a7b2-42d5-a94e-d7bfc4784cba\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bqww9" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.184344 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8cd7e753-b687-49de-83ec-55f1fc315007-signing-key\") pod \"service-ca-9c57cc56f-w529g\" (UID: \"8cd7e753-b687-49de-83ec-55f1fc315007\") " pod="openshift-service-ca/service-ca-9c57cc56f-w529g" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.194429 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d38a6e43-b412-4c01-bf04-1036f9d6942c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-w6vtn\" (UID: \"d38a6e43-b412-4c01-bf04-1036f9d6942c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-w6vtn" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.194476 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tw9ww"] Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.208672 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6zl9n" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.216972 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-csgxx"] Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.219030 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjns4" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.235329 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/66e9eba2-1514-42a7-b14b-802c380cc3b3-bound-sa-token\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.245257 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78sn8\" (UniqueName: \"kubernetes.io/projected/8cd7e753-b687-49de-83ec-55f1fc315007-kube-api-access-78sn8\") pod \"service-ca-9c57cc56f-w529g\" (UID: \"8cd7e753-b687-49de-83ec-55f1fc315007\") " pod="openshift-service-ca/service-ca-9c57cc56f-w529g" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.245483 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-w529g" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.249834 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztmsb\" (UniqueName: \"kubernetes.io/projected/d38a6e43-b412-4c01-bf04-1036f9d6942c-kube-api-access-ztmsb\") pod \"multus-admission-controller-857f4d67dd-w6vtn\" (UID: \"d38a6e43-b412-4c01-bf04-1036f9d6942c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-w6vtn" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.258421 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f926p\" (UniqueName: \"kubernetes.io/projected/0abf116d-d4aa-4bb8-95ab-430c181d5bdf-kube-api-access-f926p\") pod \"packageserver-d55dfcdfc-lwhwn\" (UID: \"0abf116d-d4aa-4bb8-95ab-430c181d5bdf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lwhwn" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.258500 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7a79ff15-67a5-43e6-a92d-84c3168db81b-plugins-dir\") pod \"csi-hostpathplugin-m8gtz\" (UID: \"7a79ff15-67a5-43e6-a92d-84c3168db81b\") " pod="hostpath-provisioner/csi-hostpathplugin-m8gtz" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.258579 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b45f921-126b-4677-889b-67da1ba1840e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7vd82\" (UID: \"0b45f921-126b-4677-889b-67da1ba1840e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7vd82" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.258619 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7a79ff15-67a5-43e6-a92d-84c3168db81b-csi-data-dir\") pod \"csi-hostpathplugin-m8gtz\" (UID: \"7a79ff15-67a5-43e6-a92d-84c3168db81b\") " pod="hostpath-provisioner/csi-hostpathplugin-m8gtz" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.258639 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7a79ff15-67a5-43e6-a92d-84c3168db81b-socket-dir\") pod \"csi-hostpathplugin-m8gtz\" (UID: \"7a79ff15-67a5-43e6-a92d-84c3168db81b\") " pod="hostpath-provisioner/csi-hostpathplugin-m8gtz" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.258653 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/10e7236c-b3fa-4c0c-9d1f-068374e029cb-profile-collector-cert\") pod \"olm-operator-6b444d44fb-x8lr7\" (UID: \"10e7236c-b3fa-4c0c-9d1f-068374e029cb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x8lr7" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.258668 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7a79ff15-67a5-43e6-a92d-84c3168db81b-mountpoint-dir\") pod \"csi-hostpathplugin-m8gtz\" (UID: \"7a79ff15-67a5-43e6-a92d-84c3168db81b\") " pod="hostpath-provisioner/csi-hostpathplugin-m8gtz" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.258703 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"srv-cert\" (UniqueName: \"kubernetes.io/secret/10e7236c-b3fa-4c0c-9d1f-068374e029cb-srv-cert\") pod \"olm-operator-6b444d44fb-x8lr7\" (UID: \"10e7236c-b3fa-4c0c-9d1f-068374e029cb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x8lr7" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.258729 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw947\" (UniqueName: \"kubernetes.io/projected/0b45f921-126b-4677-889b-67da1ba1840e-kube-api-access-bw947\") pod \"package-server-manager-789f6589d5-7vd82\" (UID: \"0b45f921-126b-4677-889b-67da1ba1840e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7vd82" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.258776 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njzgk\" (UniqueName: \"kubernetes.io/projected/365d94e6-7b9f-49bb-92db-f5a232406e10-kube-api-access-njzgk\") pod \"ingress-canary-5n46l\" (UID: \"365d94e6-7b9f-49bb-92db-f5a232406e10\") " pod="openshift-ingress-canary/ingress-canary-5n46l" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.258803 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61783842-70fb-40b1-bc57-f614ca527168-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hq69t\" (UID: \"61783842-70fb-40b1-bc57-f614ca527168\") " pod="openshift-marketplace/marketplace-operator-79b997595-hq69t" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.258818 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fb2b7d26-6891-4f5d-9765-5b8922abd5cd-node-bootstrap-token\") pod \"machine-config-server-hxqsd\" (UID: \"fb2b7d26-6891-4f5d-9765-5b8922abd5cd\") " pod="openshift-machine-config-operator/machine-config-server-hxqsd" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.258850 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0abf116d-d4aa-4bb8-95ab-430c181d5bdf-tmpfs\") pod \"packageserver-d55dfcdfc-lwhwn\" (UID: \"0abf116d-d4aa-4bb8-95ab-430c181d5bdf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lwhwn" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.258868 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0abf116d-d4aa-4bb8-95ab-430c181d5bdf-apiservice-cert\") pod \"packageserver-d55dfcdfc-lwhwn\" (UID: \"0abf116d-d4aa-4bb8-95ab-430c181d5bdf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lwhwn" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.258912 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/fb2b7d26-6891-4f5d-9765-5b8922abd5cd-certs\") pod \"machine-config-server-hxqsd\" (UID: \"fb2b7d26-6891-4f5d-9765-5b8922abd5cd\") " pod="openshift-machine-config-operator/machine-config-server-hxqsd" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.258928 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7a79ff15-67a5-43e6-a92d-84c3168db81b-registration-dir\") pod \"csi-hostpathplugin-m8gtz\" (UID: \"7a79ff15-67a5-43e6-a92d-84c3168db81b\") " 
pod="hostpath-provisioner/csi-hostpathplugin-m8gtz" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.258943 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/61783842-70fb-40b1-bc57-f614ca527168-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hq69t\" (UID: \"61783842-70fb-40b1-bc57-f614ca527168\") " pod="openshift-marketplace/marketplace-operator-79b997595-hq69t" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.258975 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/365d94e6-7b9f-49bb-92db-f5a232406e10-cert\") pod \"ingress-canary-5n46l\" (UID: \"365d94e6-7b9f-49bb-92db-f5a232406e10\") " pod="openshift-ingress-canary/ingress-canary-5n46l" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.258998 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7qh4\" (UniqueName: \"kubernetes.io/projected/61783842-70fb-40b1-bc57-f614ca527168-kube-api-access-m7qh4\") pod \"marketplace-operator-79b997595-hq69t\" (UID: \"61783842-70fb-40b1-bc57-f614ca527168\") " pod="openshift-marketplace/marketplace-operator-79b997595-hq69t" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.259018 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.259035 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sns9\" (UniqueName: \"kubernetes.io/projected/10e7236c-b3fa-4c0c-9d1f-068374e029cb-kube-api-access-8sns9\") pod \"olm-operator-6b444d44fb-x8lr7\" (UID: \"10e7236c-b3fa-4c0c-9d1f-068374e029cb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x8lr7" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.259073 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0abf116d-d4aa-4bb8-95ab-430c181d5bdf-webhook-cert\") pod \"packageserver-d55dfcdfc-lwhwn\" (UID: \"0abf116d-d4aa-4bb8-95ab-430c181d5bdf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lwhwn" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.259090 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hmv7\" (UniqueName: \"kubernetes.io/projected/7a79ff15-67a5-43e6-a92d-84c3168db81b-kube-api-access-4hmv7\") pod \"csi-hostpathplugin-m8gtz\" (UID: \"7a79ff15-67a5-43e6-a92d-84c3168db81b\") " pod="hostpath-provisioner/csi-hostpathplugin-m8gtz" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.259104 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg7sn\" (UniqueName: \"kubernetes.io/projected/fb2b7d26-6891-4f5d-9765-5b8922abd5cd-kube-api-access-zg7sn\") pod \"machine-config-server-hxqsd\" (UID: \"fb2b7d26-6891-4f5d-9765-5b8922abd5cd\") " pod="openshift-machine-config-operator/machine-config-server-hxqsd" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.259476 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7a79ff15-67a5-43e6-a92d-84c3168db81b-plugins-dir\") pod \"csi-hostpathplugin-m8gtz\" (UID: \"7a79ff15-67a5-43e6-a92d-84c3168db81b\") " pod="hostpath-provisioner/csi-hostpathplugin-m8gtz" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.266335 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b45f921-126b-4677-889b-67da1ba1840e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7vd82\" (UID: \"0b45f921-126b-4677-889b-67da1ba1840e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7vd82" Oct 07 17:05:39 crc kubenswrapper[4681]: E1007 17:05:39.266917 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:39.766887381 +0000 UTC m=+143.414298936 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.267158 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/fb2b7d26-6891-4f5d-9765-5b8922abd5cd-certs\") pod \"machine-config-server-hxqsd\" (UID: \"fb2b7d26-6891-4f5d-9765-5b8922abd5cd\") " pod="openshift-machine-config-operator/machine-config-server-hxqsd" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.267214 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/10e7236c-b3fa-4c0c-9d1f-068374e029cb-srv-cert\") pod \"olm-operator-6b444d44fb-x8lr7\" (UID: \"10e7236c-b3fa-4c0c-9d1f-068374e029cb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x8lr7" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.267411 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7a79ff15-67a5-43e6-a92d-84c3168db81b-registration-dir\") pod \"csi-hostpathplugin-m8gtz\" (UID: \"7a79ff15-67a5-43e6-a92d-84c3168db81b\") " pod="hostpath-provisioner/csi-hostpathplugin-m8gtz" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.267500 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0abf116d-d4aa-4bb8-95ab-430c181d5bdf-apiservice-cert\") pod \"packageserver-d55dfcdfc-lwhwn\" (UID: \"0abf116d-d4aa-4bb8-95ab-430c181d5bdf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lwhwn" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.268593 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61783842-70fb-40b1-bc57-f614ca527168-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hq69t\" (UID: \"61783842-70fb-40b1-bc57-f614ca527168\") " pod="openshift-marketplace/marketplace-operator-79b997595-hq69t" Oct 07 17:05:39 crc 
kubenswrapper[4681]: I1007 17:05:39.270401 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7a79ff15-67a5-43e6-a92d-84c3168db81b-mountpoint-dir\") pod \"csi-hostpathplugin-m8gtz\" (UID: \"7a79ff15-67a5-43e6-a92d-84c3168db81b\") " pod="hostpath-provisioner/csi-hostpathplugin-m8gtz" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.270500 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7a79ff15-67a5-43e6-a92d-84c3168db81b-socket-dir\") pod \"csi-hostpathplugin-m8gtz\" (UID: \"7a79ff15-67a5-43e6-a92d-84c3168db81b\") " pod="hostpath-provisioner/csi-hostpathplugin-m8gtz" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.270568 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7a79ff15-67a5-43e6-a92d-84c3168db81b-csi-data-dir\") pod \"csi-hostpathplugin-m8gtz\" (UID: \"7a79ff15-67a5-43e6-a92d-84c3168db81b\") " pod="hostpath-provisioner/csi-hostpathplugin-m8gtz" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.272369 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnmtf\" (UniqueName: \"kubernetes.io/projected/66e9eba2-1514-42a7-b14b-802c380cc3b3-kube-api-access-qnmtf\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.272705 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0abf116d-d4aa-4bb8-95ab-430c181d5bdf-tmpfs\") pod \"packageserver-d55dfcdfc-lwhwn\" (UID: \"0abf116d-d4aa-4bb8-95ab-430c181d5bdf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lwhwn" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.272980 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/365d94e6-7b9f-49bb-92db-f5a232406e10-cert\") pod \"ingress-canary-5n46l\" (UID: \"365d94e6-7b9f-49bb-92db-f5a232406e10\") " pod="openshift-ingress-canary/ingress-canary-5n46l" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.273151 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0abf116d-d4aa-4bb8-95ab-430c181d5bdf-webhook-cert\") pod \"packageserver-d55dfcdfc-lwhwn\" (UID: \"0abf116d-d4aa-4bb8-95ab-430c181d5bdf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lwhwn" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.273962 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/61783842-70fb-40b1-bc57-f614ca527168-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hq69t\" (UID: \"61783842-70fb-40b1-bc57-f614ca527168\") " pod="openshift-marketplace/marketplace-operator-79b997595-hq69t" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.277048 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fb2b7d26-6891-4f5d-9765-5b8922abd5cd-node-bootstrap-token\") pod \"machine-config-server-hxqsd\" (UID: \"fb2b7d26-6891-4f5d-9765-5b8922abd5cd\") " pod="openshift-machine-config-operator/machine-config-server-hxqsd" Oct 07 17:05:39 crc 
kubenswrapper[4681]: I1007 17:05:39.278209 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/10e7236c-b3fa-4c0c-9d1f-068374e029cb-profile-collector-cert\") pod \"olm-operator-6b444d44fb-x8lr7\" (UID: \"10e7236c-b3fa-4c0c-9d1f-068374e029cb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x8lr7" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.281413 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dcg8h"] Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.283450 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-j8hqs"] Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.306436 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cmzk\" (UniqueName: \"kubernetes.io/projected/77d83873-a7b2-42d5-a94e-d7bfc4784cba-kube-api-access-9cmzk\") pod \"catalog-operator-68c6474976-bqww9\" (UID: \"77d83873-a7b2-42d5-a94e-d7bfc4784cba\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bqww9" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.329997 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fst86"] Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.333411 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f926p\" (UniqueName: \"kubernetes.io/projected/0abf116d-d4aa-4bb8-95ab-430c181d5bdf-kube-api-access-f926p\") pod \"packageserver-d55dfcdfc-lwhwn\" (UID: \"0abf116d-d4aa-4bb8-95ab-430c181d5bdf\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lwhwn" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.359971 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7qh4\" (UniqueName: \"kubernetes.io/projected/61783842-70fb-40b1-bc57-f614ca527168-kube-api-access-m7qh4\") pod \"marketplace-operator-79b997595-hq69t\" (UID: \"61783842-70fb-40b1-bc57-f614ca527168\") " pod="openshift-marketplace/marketplace-operator-79b997595-hq69t" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.363589 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:39 crc kubenswrapper[4681]: E1007 17:05:39.364077 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:39.864064211 +0000 UTC m=+143.511475766 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.375207 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njzgk\" (UniqueName: \"kubernetes.io/projected/365d94e6-7b9f-49bb-92db-f5a232406e10-kube-api-access-njzgk\") pod \"ingress-canary-5n46l\" (UID: \"365d94e6-7b9f-49bb-92db-f5a232406e10\") " pod="openshift-ingress-canary/ingress-canary-5n46l" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.413127 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lq6nd"] Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.413536 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9m58d"] Oct 07 17:05:39 crc kubenswrapper[4681]: W1007 17:05:39.438080 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fdbba2d_2b9f_47f3_a618_f1284f5bce5b.slice/crio-3e8ffd4ccb868efa2d93fb953ab79095112c222ee15d79e85783d0755b2602f8 WatchSource:0}: Error finding container 3e8ffd4ccb868efa2d93fb953ab79095112c222ee15d79e85783d0755b2602f8: Status 404 returned error can't find the container with id 3e8ffd4ccb868efa2d93fb953ab79095112c222ee15d79e85783d0755b2602f8 Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.447317 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-6s884"] Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.447364 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rwphq"] Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.450152 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hmv7\" (UniqueName: \"kubernetes.io/projected/7a79ff15-67a5-43e6-a92d-84c3168db81b-kube-api-access-4hmv7\") pod \"csi-hostpathplugin-m8gtz\" (UID: \"7a79ff15-67a5-43e6-a92d-84c3168db81b\") " pod="hostpath-provisioner/csi-hostpathplugin-m8gtz" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.451500 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw947\" (UniqueName: \"kubernetes.io/projected/0b45f921-126b-4677-889b-67da1ba1840e-kube-api-access-bw947\") pod \"package-server-manager-789f6589d5-7vd82\" (UID: \"0b45f921-126b-4677-889b-67da1ba1840e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7vd82" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.451804 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sns9\" (UniqueName: \"kubernetes.io/projected/10e7236c-b3fa-4c0c-9d1f-068374e029cb-kube-api-access-8sns9\") pod \"olm-operator-6b444d44fb-x8lr7\" (UID: \"10e7236c-b3fa-4c0c-9d1f-068374e029cb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x8lr7" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.460201 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-w6vtn" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.465169 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg7sn\" (UniqueName: \"kubernetes.io/projected/fb2b7d26-6891-4f5d-9765-5b8922abd5cd-kube-api-access-zg7sn\") pod \"machine-config-server-hxqsd\" (UID: \"fb2b7d26-6891-4f5d-9765-5b8922abd5cd\") " pod="openshift-machine-config-operator/machine-config-server-hxqsd" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.465551 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:39 crc kubenswrapper[4681]: E1007 17:05:39.466115 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:39.966098712 +0000 UTC m=+143.613510267 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.488553 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bqww9" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.489536 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-67xwd"] Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.490846 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hgrnc"] Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.494989 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gxzkf"] Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.551374 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4g7gf"] Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.563907 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x8lr7" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.564306 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lwhwn" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.568480 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:39 crc kubenswrapper[4681]: E1007 17:05:39.569077 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:40.069054001 +0000 UTC m=+143.716465566 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.576018 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hq69t" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.582095 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7vd82" Oct 07 17:05:39 crc kubenswrapper[4681]: W1007 17:05:39.584155 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc669d56a_7d2e_4161_ac70_29d72a747038.slice/crio-d7e6ae6f7f1a364a0748148ad63b983241193d9e4cfd84bb3f8a6d6786a0d2e3 WatchSource:0}: Error finding container d7e6ae6f7f1a364a0748148ad63b983241193d9e4cfd84bb3f8a6d6786a0d2e3: Status 404 returned error can't find the container with id d7e6ae6f7f1a364a0748148ad63b983241193d9e4cfd84bb3f8a6d6786a0d2e3 Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.588183 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5n46l" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.613150 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-m8gtz" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.613544 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hxqsd" Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.670000 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:39 crc kubenswrapper[4681]: E1007 17:05:39.670340 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:40.17032877 +0000 UTC m=+143.817740325 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:39 crc kubenswrapper[4681]: W1007 17:05:39.727714 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1746a4d0_d93b_40e4_bd79_a14425d1a9cc.slice/crio-f76b4b78ac91d0c37205af048c841d1dc7b6b5e698d74435a2b2f4b188405bf3 WatchSource:0}: Error finding container f76b4b78ac91d0c37205af048c841d1dc7b6b5e698d74435a2b2f4b188405bf3: Status 404 returned error can't find the container with id f76b4b78ac91d0c37205af048c841d1dc7b6b5e698d74435a2b2f4b188405bf3 Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.788145 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:39 crc kubenswrapper[4681]: E1007 17:05:39.789814 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:40.289793166 +0000 UTC m=+143.937204721 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.797700 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-q7njt"] Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.798188 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lq6nd" event={"ID":"1af973a2-4ae5-4a2e-9d36-941f8689054b","Type":"ContainerStarted","Data":"cfbcff69f6d005e87fd1a7167c37a3ab10962397721a7bc556a00d9b436d82bd"} Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.838858 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330940-w7fp5"] Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.841990 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nds8d" event={"ID":"dff981f7-635e-4b45-bf64-fbb57407582b","Type":"ContainerStarted","Data":"8b048feef8f0f3c653902316fd6e5a4412ccbeb311847b559e5d1bf61e5c96fb"} Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.842028 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nds8d" event={"ID":"dff981f7-635e-4b45-bf64-fbb57407582b","Type":"ContainerStarted","Data":"16e27d1919847670c69052d109897b8b3250074e3ca585a5aa77da7b9c890275"} Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.847596 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dvzgt"] Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.884107 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8" event={"ID":"d712e0bd-952b-4cba-8d92-2c6e72f6b867","Type":"ContainerStarted","Data":"ddc5b1c34abf7e3446594d2ce9d889ee3cc619195102d04228d270f83102ee41"} Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.891518 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:39 crc kubenswrapper[4681]: E1007 17:05:39.894015 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:40.394001422 +0000 UTC m=+144.041412977 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.912421 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dcxsj"] Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.929531 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-j8hqs" event={"ID":"c669d56a-7d2e-4161-ac70-29d72a747038","Type":"ContainerStarted","Data":"d7e6ae6f7f1a364a0748148ad63b983241193d9e4cfd84bb3f8a6d6786a0d2e3"} Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.954177 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-hcltk" event={"ID":"1ae45842-477f-4cc6-9ff7-6c38b866e8f9","Type":"ContainerStarted","Data":"404e482a98489485996255b557f46929cf4aff59c93f9be4bf825ba87f8904d6"} Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.954212 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-hcltk" event={"ID":"1ae45842-477f-4cc6-9ff7-6c38b866e8f9","Type":"ContainerStarted","Data":"84ef0e7a777c0553ca426d7ae324013f6a2b7c3ad7be01a29484f9e16565eaec"} Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.958096 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" event={"ID":"c2c64f34-b460-412c-b82e-2dbc6c93444e","Type":"ContainerStarted","Data":"92e0324ea206585f6cd3112c710ea9922079e3c3b633afd3dcd230d8db41721e"} Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.964908 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6wcpl" event={"ID":"468bcce5-4e0a-45dd-aeda-fe1a8fd0fea2","Type":"ContainerStarted","Data":"3221e7d31858b145e24e006ec26615be3deeaa6ce85ba213af9d665d2980efcf"} Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.964951 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6wcpl" event={"ID":"468bcce5-4e0a-45dd-aeda-fe1a8fd0fea2","Type":"ContainerStarted","Data":"e08f00c85967f32339d162edade960032afced3aa011728f83e8cd2f4f8ddb18"} Oct 07 17:05:39 crc kubenswrapper[4681]: W1007 17:05:39.969667 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabc28b46_2a9f_4141_8e65_a9c956e0f261.slice/crio-c538216c8262c04f9d2196c3ac5e54448ba8a1cfa3c4b5e969ede490fad1d20d WatchSource:0}: Error finding container c538216c8262c04f9d2196c3ac5e54448ba8a1cfa3c4b5e969ede490fad1d20d: Status 404 returned error can't find the container with id c538216c8262c04f9d2196c3ac5e54448ba8a1cfa3c4b5e969ede490fad1d20d Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.970018 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" event={"ID":"77548ba9-d52a-4585-984e-e08c45a58aec","Type":"ContainerStarted","Data":"f6530e554329f7270e0ef341372004cf1fc37dbc460b5914457c4ea410237eb8"} Oct 07 17:05:39 crc kubenswrapper[4681]: 
I1007 17:05:39.977043 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9m58d" event={"ID":"2fb24a65-5ed4-42a8-9e29-c8bcd1e0de14","Type":"ContainerStarted","Data":"f54a9344c99cedada81a2dcea158d3c5402ba57d3fd347bf7a8c6557406df4b6"} Oct 07 17:05:39 crc kubenswrapper[4681]: W1007 17:05:39.982747 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ee9a95e_d102_45ea_a77b_711de5bd03a9.slice/crio-f859ceae550922b023c9a310e4820a380094d8566e7edcfdd6a4fa8d700932be WatchSource:0}: Error finding container f859ceae550922b023c9a310e4820a380094d8566e7edcfdd6a4fa8d700932be: Status 404 returned error can't find the container with id f859ceae550922b023c9a310e4820a380094d8566e7edcfdd6a4fa8d700932be Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.994112 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tg8wr" event={"ID":"ba59400b-2ce1-489d-a70d-747f23b176c6","Type":"ContainerStarted","Data":"0fa032330713450df3d79c4473396938318dcad0e1d3349e4366cdd1a40f064c"} Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.994150 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tg8wr" event={"ID":"ba59400b-2ce1-489d-a70d-747f23b176c6","Type":"ContainerStarted","Data":"2f8d3311f355cf20275a545b0d2c3a45b609c865cd9dc37ab033f04a8ceac80b"} Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.995543 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:39 crc kubenswrapper[4681]: E1007 17:05:39.995828 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:40.495789655 +0000 UTC m=+144.143201210 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:39 crc kubenswrapper[4681]: I1007 17:05:39.996498 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xjzz7" event={"ID":"f8ad8df9-8ba5-4f07-ae44-c2e1addfff5c","Type":"ContainerStarted","Data":"8a6ac5bc24184fa9a31a07fddcca984d2ec1b73918af8cc963a0fdab622e0f2a"} Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.057490 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ck2fw" event={"ID":"a5b108e0-c0b3-442b-82c7-4ec003e3de22","Type":"ContainerStarted","Data":"8074caf65e982c0bbea5436c099350cbff3c7eeededb0cb6653bd171e1365dbf"} Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.081435 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6gnnm" event={"ID":"cb20182c-f315-41b3-94e2-256dac142821","Type":"ContainerStarted","Data":"dfd224314363150497ea7fdf59331e8bce3f749262c6e6f43e17c8a177b31525"} Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.082685 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-csgxx" event={"ID":"2fdbba2d-2b9f-47f3-a618-f1284f5bce5b","Type":"ContainerStarted","Data":"3e8ffd4ccb868efa2d93fb953ab79095112c222ee15d79e85783d0755b2602f8"} Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.097290 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dcg8h" event={"ID":"ca03f8ab-38f2-4aea-9b61-54526e3c5015","Type":"ContainerStarted","Data":"c91a6a663a490784a391bc0bbcc28e32c1919ff744f37239932b49507b008c7e"} Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.097915 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6zl9n"] Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.098591 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:40 crc kubenswrapper[4681]: E1007 17:05:40.100369 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:40.60035365 +0000 UTC m=+144.247765205 (durationBeforeRetry 500ms). 
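
[Annotation] The E1007 nestedpendingoperations entries here, including the Error: line that follows this sketch, all show the same gate: a failed volume operation arms a retry deadline and any attempt before it is rejected outright ("No retries permitted until ..."). A toy version of that gate under the assumption of a fixed 500 ms window, which is what every entry in this capture reports; kubelet's real backoff logic lives in nestedpendingoperations.go and can grow the window across repeated failures:

    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    // gate mirrors the "No retries permitted until <deadline>" check: a failed
    // operation arms a deadline durationBeforeRetry in the future, and any
    // attempt before that deadline is rejected without touching the volume.
    type gate struct {
        deadline time.Time
        retryIn  time.Duration // durationBeforeRetry in the log
    }

    func (g *gate) attempt(op func() error) error {
        if time.Now().Before(g.deadline) {
            return fmt.Errorf("no retries permitted until %s (durationBeforeRetry %s)",
                g.deadline.Format("15:04:05.000"), g.retryIn)
        }
        err := op()
        if err != nil {
            g.deadline = time.Now().Add(g.retryIn)
        }
        return err
    }

    func main() {
        g := &gate{retryIn: 500 * time.Millisecond}
        mountDevice := func() error {
            return errors.New("driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers")
        }
        fmt.Println(g.attempt(mountDevice)) // real failure; arms the 500ms window
        fmt.Println(g.attempt(mountDevice)) // rejected: still inside the window
        time.Sleep(600 * time.Millisecond)
        fmt.Println(g.attempt(mountDevice)) // window expired; the op runs again
    }

This is why the mount and unmount attempts above recur at roughly half-second intervals rather than spinning continuously.
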
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.104821 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm"] Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.115707 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fst86" event={"ID":"0d8e31b0-652c-44ff-99b0-04ae7d329f6f","Type":"ContainerStarted","Data":"8f55dc7cfbe5e9300f49fdba8b028e82f4d16158c752b042caae4733151fa886"} Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.118631 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dbxkf" event={"ID":"3017a611-cb0d-4f79-b6f8-2634dc026e2e","Type":"ContainerStarted","Data":"51c7f6f370b99f27e1e1769e32f727ad656a80470c40a38adb60c67301dc76f9"} Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.167704 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7vd82"] Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.187029 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mjns4"] Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.205270 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:40 crc kubenswrapper[4681]: E1007 17:05:40.205510 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:40.705489392 +0000 UTC m=+144.352900947 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.205554 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:40 crc kubenswrapper[4681]: E1007 17:05:40.206275 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:40.706267254 +0000 UTC m=+144.353678809 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.308409 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:40 crc kubenswrapper[4681]: E1007 17:05:40.308533 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:40.808504702 +0000 UTC m=+144.455916257 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.308684 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:40 crc kubenswrapper[4681]: E1007 17:05:40.308976 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:40.808965045 +0000 UTC m=+144.456376600 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:40 crc kubenswrapper[4681]: W1007 17:05:40.357520 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b45f921_126b_4677_889b_67da1ba1840e.slice/crio-bcc02b137a7648c2001fdc50ced4d01aa321ee14d276d8eac32e82db71baad11 WatchSource:0}: Error finding container bcc02b137a7648c2001fdc50ced4d01aa321ee14d276d8eac32e82db71baad11: Status 404 returned error can't find the container with id bcc02b137a7648c2001fdc50ced4d01aa321ee14d276d8eac32e82db71baad11 Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.392747 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ns9f4"] Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.412619 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:40 crc kubenswrapper[4681]: E1007 17:05:40.415278 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:40.912904711 +0000 UTC m=+144.560316266 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.480275 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-w529g"] Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.516073 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:40 crc kubenswrapper[4681]: E1007 17:05:40.517555 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:41.017518687 +0000 UTC m=+144.664930242 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.618766 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:40 crc kubenswrapper[4681]: E1007 17:05:40.621675 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:41.121655499 +0000 UTC m=+144.769067054 (durationBeforeRetry 500ms). 
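
[Annotation] Each record in this journal wraps a klog line: "Oct 07 HH:MM:SS crc kubenswrapper[pid]:" from journald, then a klog header like "E1007 17:05:39.364077 4681 nestedpendingoperations.go:348]" carrying severity (I/W/E), MMDD date, timestamp, and source file:line, then the message. For mining a capture like this one, a small parser sketch with the field layout inferred from these entries (the same E-severity records continue right after it):

    package main

    import (
        "bufio"
        "fmt"
        "regexp"
        "strings"
    )

    // klogRe matches the kubenswrapper prefix plus the klog header fields.
    var klogRe = regexp.MustCompile(
        `kubenswrapper\[\d+\]: ([IWE])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d{6})\s+\d+ ([\w.]+:\d+)\] (.*)`)

    func main() {
        sample := `Oct 07 17:05:39 crc kubenswrapper[4681]: E1007 17:05:39.364077 4681 nestedpendingoperations.go:348] Operation for "{volumeName:...}" failed.`
        sc := bufio.NewScanner(strings.NewReader(sample))
        for sc.Scan() {
            m := klogRe.FindStringSubmatch(sc.Text())
            if m == nil {
                continue // not a kubelet record (e.g., probe output continuation lines)
            }
            sev := map[string]string{"I": "info", "W": "warning", "E": "error"}[m[1]]
            fmt.Printf("severity=%s date(MMDD)=%s time=%s source=%s msg=%q\n",
                sev, m[2], m[3], m[4], m[5])
        }
    }

Multi-line payloads, such as the healthz check output from the router probe further down, arrive as continuation records without a klog header and would be skipped by this regex.
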
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.621797 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-m8gtz"] Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.629217 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hq69t"] Oct 07 17:05:40 crc kubenswrapper[4681]: W1007 17:05:40.644689 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60fcc2ef_a564_4e1e_947d_1dd672c4ced7.slice/crio-d219ba1b6c33d934635ef33855bd8cd88fc597512b7e9f49e592909c2f0428de WatchSource:0}: Error finding container d219ba1b6c33d934635ef33855bd8cd88fc597512b7e9f49e592909c2f0428de: Status 404 returned error can't find the container with id d219ba1b6c33d934635ef33855bd8cd88fc597512b7e9f49e592909c2f0428de Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.726793 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:40 crc kubenswrapper[4681]: E1007 17:05:40.727196 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:41.227178552 +0000 UTC m=+144.874590107 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.749980 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5n46l"] Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.799450 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x8lr7"] Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.829711 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:40 crc kubenswrapper[4681]: E1007 17:05:40.830008 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:41.329987146 +0000 UTC m=+144.977398701 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.830813 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:40 crc kubenswrapper[4681]: E1007 17:05:40.837476 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:41.337458462 +0000 UTC m=+144.984870017 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.851817 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-hcltk" Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.862053 4681 patch_prober.go:28] interesting pod/router-default-5444994796-hcltk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 17:05:40 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld Oct 07 17:05:40 crc kubenswrapper[4681]: [+]process-running ok Oct 07 17:05:40 crc kubenswrapper[4681]: healthz check failed Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.862103 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hcltk" podUID="1ae45842-477f-4cc6-9ff7-6c38b866e8f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.932211 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:40 crc kubenswrapper[4681]: E1007 17:05:40.932662 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:41.432636425 +0000 UTC m=+145.080047980 (durationBeforeRetry 500ms). 
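
[Annotation] The router-default startup probe above fails with "HTTP probe failed with statuscode: 500", and the captured start-of-body is the aggregated healthz report ([-]backend-http and [-]has-synced failing, [+]process-running ok). In the spirit of kubelet's HTTP prober, a sketch where 2xx/3xx is success and anything else fails with the start of the body kept for the event message; the URL is illustrative, not taken from the log (the Error: line after the sketch resumes the CSI retry notice above):

    package main

    import (
        "fmt"
        "io"
        "net/http"
        "time"
    )

    func probeHTTP(url string) error {
        client := &http.Client{Timeout: time.Second}
        resp, err := client.Get(url)
        if err != nil {
            return err // e.g., "connect: connection refused", as with controller-manager below
        }
        defer resp.Body.Close()
        // Keep only the start of the body, as the "start-of-body=" field does.
        body, _ := io.ReadAll(io.LimitReader(resp.Body, 1024))
        if resp.StatusCode < 200 || resp.StatusCode >= 400 {
            return fmt.Errorf("HTTP probe failed with statuscode: %d, start-of-body: %s",
                resp.StatusCode, body)
        }
        return nil
    }

    func main() {
        if err := probeHTTP("http://127.0.0.1:1936/healthz/ready"); err != nil {
            fmt.Println("Probe failed:", err)
        }
    }

A failed startup probe only delays readiness; the router keeps retrying until backend-http and has-synced go green.
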
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.935955 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lwhwn"] Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.940275 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:40 crc kubenswrapper[4681]: E1007 17:05:40.940681 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:41.440665108 +0000 UTC m=+145.088076663 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.949761 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bqww9"] Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.974365 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-xjzz7" podStartSLOduration=123.974332765 podStartE2EDuration="2m3.974332765s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:40.971841673 +0000 UTC m=+144.619253228" watchObservedRunningTime="2025-10-07 17:05:40.974332765 +0000 UTC m=+144.621744320" Oct 07 17:05:40 crc kubenswrapper[4681]: I1007 17:05:40.978520 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-w6vtn"] Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.032815 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6wcpl" podStartSLOduration=125.032800782 podStartE2EDuration="2m5.032800782s" podCreationTimestamp="2025-10-07 17:03:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:41.011795153 +0000 UTC m=+144.659206708" watchObservedRunningTime="2025-10-07 17:05:41.032800782 +0000 UTC m=+144.680212337" Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.033905 
4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ck2fw" podStartSLOduration=125.033901494 podStartE2EDuration="2m5.033901494s" podCreationTimestamp="2025-10-07 17:03:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:41.031932227 +0000 UTC m=+144.679343792" watchObservedRunningTime="2025-10-07 17:05:41.033901494 +0000 UTC m=+144.681313039" Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.041932 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:41 crc kubenswrapper[4681]: E1007 17:05:41.042208 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:41.542193995 +0000 UTC m=+145.189605550 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.081031 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-nds8d" podStartSLOduration=125.081012991 podStartE2EDuration="2m5.081012991s" podCreationTimestamp="2025-10-07 17:03:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:41.08063146 +0000 UTC m=+144.728043015" watchObservedRunningTime="2025-10-07 17:05:41.081012991 +0000 UTC m=+144.728424546" Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.155200 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:41 crc kubenswrapper[4681]: E1007 17:05:41.155609 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:41.655596166 +0000 UTC m=+145.303007721 (durationBeforeRetry 500ms). 
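
[Annotation] The pod_startup_latency_tracker entries here are plain arithmetic: for these pods the pull timestamps are the zero time ("0001-01-01 00:00:00"), and podStartSLOduration is exactly watchObservedRunningTime minus podCreationTimestamp, so the SLO and E2E durations agree. A worked check of the etcd-operator-b45778765-xjzz7 numbers logged above (the Error: line after this sketch resumes the CSI retry notice):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        created, _ := time.Parse(layout, "2025-10-07 17:03:37 +0000 UTC")
        observed, _ := time.Parse(layout, "2025-10-07 17:05:40.974332765 +0000 UTC")
        // Prints 123.974332765, matching podStartSLOduration ("2m3.974332765s").
        fmt.Println(observed.Sub(created).Seconds())
    }

The two-minute-plus figures are expected on this node: these pods were created at ~17:03:37, during the kubelet start captured at the top of this journal, and only reached Running once the sync loop worked through the backlog.
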
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.161123 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-hcltk" podStartSLOduration=124.161108676 podStartE2EDuration="2m4.161108676s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:41.160477458 +0000 UTC m=+144.807889013" watchObservedRunningTime="2025-10-07 17:05:41.161108676 +0000 UTC m=+144.808520241" Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.166570 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6gnnm" podStartSLOduration=125.166555484 podStartE2EDuration="2m5.166555484s" podCreationTimestamp="2025-10-07 17:03:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:41.115078001 +0000 UTC m=+144.762489556" watchObservedRunningTime="2025-10-07 17:05:41.166555484 +0000 UTC m=+144.813967039" Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.218628 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fst86" event={"ID":"0d8e31b0-652c-44ff-99b0-04ae7d329f6f","Type":"ContainerStarted","Data":"9546f84c29a5e91e6d3fa5541f3809ad3574bf869f88501ff464ba6b851f882d"} Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.219591 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-fst86" Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.221741 4681 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-fst86 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.221852 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-fst86" podUID="0d8e31b0-652c-44ff-99b0-04ae7d329f6f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.242515 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-tg8wr" podStartSLOduration=124.242501708 podStartE2EDuration="2m4.242501708s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:41.239943814 +0000 UTC m=+144.887355369" watchObservedRunningTime="2025-10-07 17:05:41.242501708 +0000 UTC m=+144.889913263" Oct 07 17:05:41 crc 
kubenswrapper[4681]: I1007 17:05:41.254249 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6zl9n" event={"ID":"dd5794df-cde0-4881-921f-9ba7006d4281","Type":"ContainerStarted","Data":"740e211c286430f9ef1ac4807e224fffca9d2b713c0148b1969110e7db3fff65"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.259956 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:41 crc kubenswrapper[4681]: E1007 17:05:41.260254 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:41.760238983 +0000 UTC m=+145.407650538 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.267489 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7vd82" event={"ID":"0b45f921-126b-4677-889b-67da1ba1840e","Type":"ContainerStarted","Data":"bcc02b137a7648c2001fdc50ced4d01aa321ee14d276d8eac32e82db71baad11"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.286191 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-fst86" podStartSLOduration=124.286174396 podStartE2EDuration="2m4.286174396s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:41.282304683 +0000 UTC m=+144.929716238" watchObservedRunningTime="2025-10-07 17:05:41.286174396 +0000 UTC m=+144.933585951"
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.292055 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm" event={"ID":"b9b8f6ae-aa50-4e7a-a59e-359b18fada73","Type":"ContainerStarted","Data":"7463f0e6e7d4975d0033d4e3f5105e1d6f2dd97eaeaa6484480ecf6acfe96dc9"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.304029 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hxqsd" event={"ID":"fb2b7d26-6891-4f5d-9765-5b8922abd5cd","Type":"ContainerStarted","Data":"00aec82681647eae62c4915235a32c6bb7d93270f00cf6b70efefa6c472c577a"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.315353 4681 generic.go:334] "Generic (PLEG): container finished" podID="d712e0bd-952b-4cba-8d92-2c6e72f6b867" containerID="4a34798e6f7b71ad9edc3cd4fc95559bc8ddbd55ef4cd3ccf838e0e2bb55af78" exitCode=0
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.317751 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8" event={"ID":"d712e0bd-952b-4cba-8d92-2c6e72f6b867","Type":"ContainerDied","Data":"4a34798e6f7b71ad9edc3cd4fc95559bc8ddbd55ef4cd3ccf838e0e2bb55af78"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.323682 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-w529g" event={"ID":"8cd7e753-b687-49de-83ec-55f1fc315007","Type":"ContainerStarted","Data":"7f31e4cbfe419666f6a730d044554e1e1b3a7fe970624d8897776a6ebb364a1a"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.334656 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6s884" event={"ID":"d181d51e-2df2-4025-b7aa-282418e6c9da","Type":"ContainerStarted","Data":"a06ca1502285a81f750ecfe390bdfdda1d32e9632b2d01c687a131c510c22bb4"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.334915 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6s884" event={"ID":"d181d51e-2df2-4025-b7aa-282418e6c9da","Type":"ContainerStarted","Data":"33e9ee053367302e30dfff2b4a44b4d5c55dcb299810363e0877a14a86f9fe6d"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.338028 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-m8gtz" event={"ID":"7a79ff15-67a5-43e6-a92d-84c3168db81b","Type":"ContainerStarted","Data":"12c5bf46bed0542a3610401e1b2709c13d3144c8f6bb99e68b21b9ad97e4168b"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.342415 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5n46l" event={"ID":"365d94e6-7b9f-49bb-92db-f5a232406e10","Type":"ContainerStarted","Data":"c7dd92824a6c47b1218f4d68e5ae50708e1afa5d79c4978b6c3e8a78bb5a9456"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.344966 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-67xwd" event={"ID":"837aa149-aced-4911-bdf3-c25e502dc542","Type":"ContainerStarted","Data":"fcba185763908483a22423119b1b959a363db230c2362a004f811b52aaa2eff0"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.344989 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-67xwd" event={"ID":"837aa149-aced-4911-bdf3-c25e502dc542","Type":"ContainerStarted","Data":"7198ccab3828bc60b500d785880d35e74837a148572c045f44a7625a25f16bd6"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.353439 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcxsj" event={"ID":"005cd89d-7e4e-4bed-aa11-e4d6f871710b","Type":"ContainerStarted","Data":"2ddabd92304a8b59187bfb56ab0fc623b9d6eaa7630986813076c269801830c4"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.356326 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dbxkf" event={"ID":"3017a611-cb0d-4f79-b6f8-2634dc026e2e","Type":"ContainerStarted","Data":"f94585d3de9fff5925815482b16bba316ef1daf4274167f4929a9a783ba331d7"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.359921 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dcg8h" event={"ID":"ca03f8ab-38f2-4aea-9b61-54526e3c5015","Type":"ContainerStarted","Data":"bbaf0f6f289e659ae42ddefebe4961b1726fd111a97d5352a661ef6bdd12591a"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.361841 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:41 crc kubenswrapper[4681]: E1007 17:05:41.364284 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:41.864272483 +0000 UTC m=+145.511684038 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.366815 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ns9f4" event={"ID":"60fcc2ef-a564-4e1e-947d-1dd672c4ced7","Type":"ContainerStarted","Data":"d219ba1b6c33d934635ef33855bd8cd88fc597512b7e9f49e592909c2f0428de"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.371822 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgrnc" event={"ID":"a96ffd28-b774-40e1-ad52-e6fa63483f1d","Type":"ContainerStarted","Data":"5a554e630ba66cfc6e54d2b76c2d285d025b3782bc8a430f8a2ecbec150bae48"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.375916 4681 generic.go:334] "Generic (PLEG): container finished" podID="77548ba9-d52a-4585-984e-e08c45a58aec" containerID="f7b85436b7a54e98301126a729bc16becd9b9930d9b2261ee416cb856becf81e" exitCode=0
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.375943 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" event={"ID":"77548ba9-d52a-4585-984e-e08c45a58aec","Type":"ContainerDied","Data":"f7b85436b7a54e98301126a729bc16becd9b9930d9b2261ee416cb856becf81e"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.384783 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rwphq" event={"ID":"26c685e0-8c5d-4fdc-a5f4-8b746d285813","Type":"ContainerStarted","Data":"d19fae8c560c9550c8ef2bbedc089b345c94fd5ebca086e7033f0fb080543511"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.396610 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-q7njt" event={"ID":"2c7ff5e6-7b88-45f4-ab1c-fb8b7a40dc62","Type":"ContainerStarted","Data":"ef8baa1764cc904cc0d8791fc81471722318d902119dacf4c4335ac91126d9e4"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.404458 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lwhwn" event={"ID":"0abf116d-d4aa-4bb8-95ab-430c181d5bdf","Type":"ContainerStarted","Data":"d85c5a8f50632491694b506ecf6787402cbb638853f7b593638374954f7a38f3"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.415204 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6gnnm" event={"ID":"cb20182c-f315-41b3-94e2-256dac142821","Type":"ContainerStarted","Data":"5e1a554af6c4f1bf918fefb4c48f14bb9e2c7053979b822bc79083f523a59674"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.433155 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x8lr7" event={"ID":"10e7236c-b3fa-4c0c-9d1f-068374e029cb","Type":"ContainerStarted","Data":"e518e8817ba1f3849d73babf7ca06fba3d17ddc2e70a0e02f464b21b00fda620"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.464224 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:41 crc kubenswrapper[4681]: E1007 17:05:41.465559 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:41.965538812 +0000 UTC m=+145.612950367 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.487609 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dvzgt" event={"ID":"4ee9a95e-d102-45ea-a77b-711de5bd03a9","Type":"ContainerStarted","Data":"f859ceae550922b023c9a310e4820a380094d8566e7edcfdd6a4fa8d700932be"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.495567 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bqww9" event={"ID":"77d83873-a7b2-42d5-a94e-d7bfc4784cba","Type":"ContainerStarted","Data":"080494e86f1e2f4292d5b9ee9d19e1a8c7e81abc80217006b75cc03c951b993a"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.497572 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lq6nd" event={"ID":"1af973a2-4ae5-4a2e-9d36-941f8689054b","Type":"ContainerStarted","Data":"f80fe7eadfbe5d3c19b39a9beb0338584e1ad3705033f8fab5f0057d9c19c01b"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.513788 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hq69t" event={"ID":"61783842-70fb-40b1-bc57-f614ca527168","Type":"ContainerStarted","Data":"92a72098a7ac8ed3169de6adac3c31ab6cd5e89ba28d88dd5cb7b32a33efd9a6"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.521833 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4g7gf" event={"ID":"1746a4d0-d93b-40e4-bd79-a14425d1a9cc","Type":"ContainerStarted","Data":"f76b4b78ac91d0c37205af048c841d1dc7b6b5e698d74435a2b2f4b188405bf3"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.527083 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" event={"ID":"c2c64f34-b460-412c-b82e-2dbc6c93444e","Type":"ContainerStarted","Data":"05cc4b99840a9e8dfd1f03e3ae935138c490472a827117dcfa16ccd394711d04"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.527269 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww"
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.529269 4681 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-tw9ww container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.37:6443/healthz\": dial tcp 10.217.0.37:6443: connect: connection refused" start-of-body=
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.529310 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" podUID="c2c64f34-b460-412c-b82e-2dbc6c93444e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.37:6443/healthz\": dial tcp 10.217.0.37:6443: connect: connection refused"
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.532460 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gxzkf" event={"ID":"67274805-ff68-4381-b1b6-9a6fbf85aca5","Type":"ContainerStarted","Data":"a44e26c3cc36be09a398a011b21f94cf5925c3ce4f69e37e3b7eabfaa25d4f8e"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.535532 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330940-w7fp5" event={"ID":"abc28b46-2a9f-4141-8e65-a9c956e0f261","Type":"ContainerStarted","Data":"86c8f25c68b4cd646ad8682cad9f74bc3d777bdb6ed7b62f93e0ab1890d5d373"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.535579 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330940-w7fp5" event={"ID":"abc28b46-2a9f-4141-8e65-a9c956e0f261","Type":"ContainerStarted","Data":"c538216c8262c04f9d2196c3ac5e54448ba8a1cfa3c4b5e969ede490fad1d20d"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.538769 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjns4" event={"ID":"9fd63e72-276a-43fe-8927-da5aba5b7a98","Type":"ContainerStarted","Data":"d230e3be15571045b8c830a4b5a8294f2a91786d3e4acec9c7bd06c021238dad"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.547103 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-w6vtn" event={"ID":"d38a6e43-b412-4c01-bf04-1036f9d6942c","Type":"ContainerStarted","Data":"b0c75cd11291806653397f4c9c6ed272e2566de52a12f6b5acf50c578d755d28"}
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.566329 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:41 crc kubenswrapper[4681]: E1007 17:05:41.568050 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:42.068033296 +0000 UTC m=+145.715444851 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.668765 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:41 crc kubenswrapper[4681]: E1007 17:05:41.669083 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:42.169061868 +0000 UTC m=+145.816473423 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.670238 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:41 crc kubenswrapper[4681]: E1007 17:05:41.671253 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:42.171238161 +0000 UTC m=+145.818649716 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.775173 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:41 crc kubenswrapper[4681]: E1007 17:05:41.775738 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:42.275723984 +0000 UTC m=+145.923135539 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.863921 4681 patch_prober.go:28] interesting pod/router-default-5444994796-hcltk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 07 17:05:41 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld
Oct 07 17:05:41 crc kubenswrapper[4681]: [+]process-running ok
Oct 07 17:05:41 crc kubenswrapper[4681]: healthz check failed
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.864180 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hcltk" podUID="1ae45842-477f-4cc6-9ff7-6c38b866e8f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.881965 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:41 crc kubenswrapper[4681]: E1007 17:05:41.882524 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:42.382504363 +0000 UTC m=+146.029915948 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.906337 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29330940-w7fp5" podStartSLOduration=125.906305224 podStartE2EDuration="2m5.906305224s" podCreationTimestamp="2025-10-07 17:03:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:41.904258704 +0000 UTC m=+145.551670259" watchObservedRunningTime="2025-10-07 17:05:41.906305224 +0000 UTC m=+145.553716779"
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.990674 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:41 crc kubenswrapper[4681]: E1007 17:05:41.991062 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:42.491044923 +0000 UTC m=+146.138456478 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:41 crc kubenswrapper[4681]: I1007 17:05:41.995678 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-dcg8h" podStartSLOduration=125.995660967 podStartE2EDuration="2m5.995660967s" podCreationTimestamp="2025-10-07 17:03:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:41.952517214 +0000 UTC m=+145.599928769" watchObservedRunningTime="2025-10-07 17:05:41.995660967 +0000 UTC m=+145.643072522"
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.049441 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" podStartSLOduration=126.049422157 podStartE2EDuration="2m6.049422157s" podCreationTimestamp="2025-10-07 17:03:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:42.041799906 +0000 UTC m=+145.689211461" watchObservedRunningTime="2025-10-07 17:05:42.049422157 +0000 UTC m=+145.696833712"
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.084445 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-q7njt" podStartSLOduration=125.084420924 podStartE2EDuration="2m5.084420924s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:42.075038151 +0000 UTC m=+145.722449706" watchObservedRunningTime="2025-10-07 17:05:42.084420924 +0000 UTC m=+145.731832479"
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.092441 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:42 crc kubenswrapper[4681]: E1007 17:05:42.092849 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:42.592838098 +0000 UTC m=+146.240249643 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.122448 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gxzkf" podStartSLOduration=125.122428836 podStartE2EDuration="2m5.122428836s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:42.119629735 +0000 UTC m=+145.767041300" watchObservedRunningTime="2025-10-07 17:05:42.122428836 +0000 UTC m=+145.769840391"
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.193294 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:42 crc kubenswrapper[4681]: E1007 17:05:42.193779 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:42.693757177 +0000 UTC m=+146.341168732 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.199667 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.199729 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.246328 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-67xwd" podStartSLOduration=125.246310911 podStartE2EDuration="2m5.246310911s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:42.163102507 +0000 UTC m=+145.810514072" watchObservedRunningTime="2025-10-07 17:05:42.246310911 +0000 UTC m=+145.893722466"
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.294626 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:42 crc kubenswrapper[4681]: E1007 17:05:42.294957 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:42.794939893 +0000 UTC m=+146.442351448 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.395572 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:42 crc kubenswrapper[4681]: E1007 17:05:42.396503 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:42.89648621 +0000 UTC m=+146.543897765 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.498627 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:42 crc kubenswrapper[4681]: E1007 17:05:42.499326 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:42.999312315 +0000 UTC m=+146.646723870 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.599892 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:42 crc kubenswrapper[4681]: E1007 17:05:42.600241 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:43.100227134 +0000 UTC m=+146.747638689 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.635350 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-q7njt" event={"ID":"2c7ff5e6-7b88-45f4-ab1c-fb8b7a40dc62","Type":"ContainerStarted","Data":"f2d50e957ad5b194e8f0b0448bcfed3015c2501c42c23665e0ec2c9c8c081c2e"}
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.681593 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" event={"ID":"77548ba9-d52a-4585-984e-e08c45a58aec","Type":"ContainerStarted","Data":"4181ad0ca3c58fc22fa235a15df34cbc0a3e7b1d17cbb7776b4597a3d9af6ec0"}
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.694301 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-w529g" event={"ID":"8cd7e753-b687-49de-83ec-55f1fc315007","Type":"ContainerStarted","Data":"e32ccef4b4ad99c983c35cb09d81fbaae1652d42f470c84e049bc5564cbcdb24"}
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.708356 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:42 crc kubenswrapper[4681]: E1007 17:05:42.708797 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:43.208779483 +0000 UTC m=+146.856191038 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.744473 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-w529g" podStartSLOduration=125.744458409 podStartE2EDuration="2m5.744458409s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:42.742145962 +0000 UTC m=+146.389557517" watchObservedRunningTime="2025-10-07 17:05:42.744458409 +0000 UTC m=+146.391869964"
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.771076 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.773463 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcxsj" event={"ID":"005cd89d-7e4e-4bed-aa11-e4d6f871710b","Type":"ContainerStarted","Data":"1ca13177773d7fe5e887c2d06d9b109d7c1558c876659b6e2c992dfb46678cfe"}
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.773494 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcxsj" event={"ID":"005cd89d-7e4e-4bed-aa11-e4d6f871710b","Type":"ContainerStarted","Data":"6587630e09fec4bb10f027e0b5e9284701f10fbe1f88b2aa0403e95f0be814c1"}
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.791819 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lwhwn"
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.796237 4681 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-lwhwn container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body=
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.796270 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lwhwn" podUID="0abf116d-d4aa-4bb8-95ab-430c181d5bdf" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused"
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.803400 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6zl9n" event={"ID":"dd5794df-cde0-4881-921f-9ba7006d4281","Type":"ContainerStarted","Data":"082ac4c96e724e3e9982e21b5a72ceaf91c84ac2791117d3f8e2187f9f33516c"}
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.810504 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:42 crc kubenswrapper[4681]: E1007 17:05:42.811953 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:43.311917777 +0000 UTC m=+146.959329332 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.827357 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6s884" event={"ID":"d181d51e-2df2-4025-b7aa-282418e6c9da","Type":"ContainerStarted","Data":"4dd54c3b107dc5b143820cb8057b47d4ebbc66b7ee709417b7f2a79252632043"}
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.841056 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dvzgt" event={"ID":"4ee9a95e-d102-45ea-a77b-711de5bd03a9","Type":"ContainerStarted","Data":"283ebd028c2fffa73e7ba869d52f14e6e7cf7b4012d40147165692f0ade2f460"}
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.862056 4681 patch_prober.go:28] interesting pod/router-default-5444994796-hcltk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 07 17:05:42 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld
Oct 07 17:05:42 crc kubenswrapper[4681]: [+]process-running ok
Oct 07 17:05:42 crc kubenswrapper[4681]: healthz check failed
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.862106 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hcltk" podUID="1ae45842-477f-4cc6-9ff7-6c38b866e8f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.875802 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcxsj" podStartSLOduration=125.875782031 podStartE2EDuration="2m5.875782031s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:42.871007282 +0000 UTC m=+146.518418837" watchObservedRunningTime="2025-10-07 17:05:42.875782031 +0000 UTC m=+146.523193586"
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.898581 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9m58d" event={"ID":"2fb24a65-5ed4-42a8-9e29-c8bcd1e0de14","Type":"ContainerStarted","Data":"640a9d025c713ed70786a9a0c3799a8ea81a10ef4e3de7844dd1263e5539c89c"}
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.921157 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:42 crc kubenswrapper[4681]: E1007 17:05:42.922565 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:43.422550518 +0000 UTC m=+147.069962073 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.927170 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lwhwn" podStartSLOduration=125.927153311 podStartE2EDuration="2m5.927153311s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:42.926538874 +0000 UTC m=+146.573950439" watchObservedRunningTime="2025-10-07 17:05:42.927153311 +0000 UTC m=+146.574564866"
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.951802 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6s884" podStartSLOduration=125.951786466 podStartE2EDuration="2m5.951786466s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:42.951469477 +0000 UTC m=+146.598881052" watchObservedRunningTime="2025-10-07 17:05:42.951786466 +0000 UTC m=+146.599198021"
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.956924 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dbxkf" event={"ID":"3017a611-cb0d-4f79-b6f8-2634dc026e2e","Type":"ContainerStarted","Data":"44acdbbea3948b6839c58ec5fc0996e1a023f9a869436f1cb6e8ffaa698efc87"}
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.983582 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bqww9" event={"ID":"77d83873-a7b2-42d5-a94e-d7bfc4784cba","Type":"ContainerStarted","Data":"54cbb4582a05b4316d43512568ac6059a23f3d2f68377a65f35dadc52f140753"}
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.984230 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bqww9"
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.986617 4681 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-bqww9 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body=
Oct 07 17:05:42 crc kubenswrapper[4681]: I1007 17:05:42.986649 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bqww9" podUID="77d83873-a7b2-42d5-a94e-d7bfc4784cba" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused"
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.006200 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6zl9n" podStartSLOduration=126.006181506 podStartE2EDuration="2m6.006181506s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:43.005225397 +0000 UTC m=+146.652636952" watchObservedRunningTime="2025-10-07 17:05:43.006181506 +0000 UTC m=+146.653593061"
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.016150 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7vd82" event={"ID":"0b45f921-126b-4677-889b-67da1ba1840e","Type":"ContainerStarted","Data":"53ae08c4acbfa82fdc20c04cd03a6799ac2ee12164349df94a97ce35a1871dcf"}
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.016477 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7vd82"
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.022131 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:43 crc kubenswrapper[4681]: E1007 17:05:43.023024 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:43.522999534 +0000 UTC m=+147.170411089 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.055153 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm"
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.055187 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm" event={"ID":"b9b8f6ae-aa50-4e7a-a59e-359b18fada73","Type":"ContainerStarted","Data":"227ec6845a07508f16d53b03b5cb376f23d51d282dbea4bb067fd6e339778a51"}
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.058384 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4g7gf" event={"ID":"1746a4d0-d93b-40e4-bd79-a14425d1a9cc","Type":"ContainerStarted","Data":"0ec367253c6d73b3ec5dc2ed4fdc523dcd1cdd19618e9d50ee5bccc63c425367"}
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.075630 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dbxkf" podStartSLOduration=127.07561403 podStartE2EDuration="2m7.07561403s" podCreationTimestamp="2025-10-07 17:03:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:43.073199031 +0000 UTC m=+146.720610586" watchObservedRunningTime="2025-10-07 17:05:43.07561403 +0000 UTC m=+146.723025575"
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.084308 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rwphq" event={"ID":"26c685e0-8c5d-4fdc-a5f4-8b746d285813","Type":"ContainerStarted","Data":"18f493a0c2c0e3be05eaf2ac48058217734d6d93fd5e78f2f285ca61fdd2368a"}
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.099812 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gxzkf" event={"ID":"67274805-ff68-4381-b1b6-9a6fbf85aca5","Type":"ContainerStarted","Data":"ed82ec3efa0e65956678d1d26a1fe1bb97f1f4a5c2551f53995df89350741b12"}
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.105944 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bqww9" podStartSLOduration=126.10592676 podStartE2EDuration="2m6.10592676s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:43.104769787 +0000 UTC m=+146.752181342" watchObservedRunningTime="2025-10-07 17:05:43.10592676 +0000 UTC m=+146.753338315"
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.123119 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:43 crc kubenswrapper[4681]: E1007 17:05:43.124944 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:43.624925632 +0000 UTC m=+147.272337287 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.129372 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-j8hqs" event={"ID":"c669d56a-7d2e-4161-ac70-29d72a747038","Type":"ContainerStarted","Data":"9556161e026113dc81777d6ec5821f3a398118fa61bddb22e77ee95e5c16bfc9"}
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.130237 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-j8hqs"
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.138998 4681 patch_prober.go:28] interesting pod/downloads-7954f5f757-j8hqs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body=
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.139040 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-j8hqs" podUID="c669d56a-7d2e-4161-ac70-29d72a747038" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused"
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.160484 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-csgxx" event={"ID":"2fdbba2d-2b9f-47f3-a618-f1284f5bce5b","Type":"ContainerStarted","Data":"f9d210de046167d5754093e5bfb23cf92367b806807a931658f21e5d21f5eff4"}
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.164575 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-csgxx"
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.172084 4681 patch_prober.go:28] interesting pod/console-operator-58897d9998-csgxx container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.172137 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-csgxx" podUID="2fdbba2d-2b9f-47f3-a618-f1284f5bce5b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused"
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.185805 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-9m58d" podStartSLOduration=126.185784778 podStartE2EDuration="2m6.185784778s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:43.141100751 +0000 UTC m=+146.788512326" watchObservedRunningTime="2025-10-07 17:05:43.185784778 +0000 UTC m=+146.833196323"
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.203696 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-hq69t"
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.213786 4681 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hq69t container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body=
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.213844 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hq69t" podUID="61783842-70fb-40b1-bc57-f614ca527168" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused"
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.216618 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x8lr7"
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.224538 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:43 crc kubenswrapper[4681]: E1007 17:05:43.225778 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:43.725755118 +0000 UTC m=+147.373166673 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.232004 4681 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-x8lr7 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body=
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.232052 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x8lr7" podUID="10e7236c-b3fa-4c0c-9d1f-068374e029cb" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused"
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.235262 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjns4" event={"ID":"9fd63e72-276a-43fe-8927-da5aba5b7a98","Type":"ContainerStarted","Data":"f3e97290630cb38b0a709f82d0601168b83582f49c92787ee826b4c525828484"}
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.237610 4681 generic.go:334] "Generic (PLEG): container finished" podID="a96ffd28-b774-40e1-ad52-e6fa63483f1d" containerID="e4ae34d8d439b062d0d5a53d14887a6b1a0afb5e5f767e1d060f81ce70e68a6a" exitCode=0
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.237656 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgrnc" event={"ID":"a96ffd28-b774-40e1-ad52-e6fa63483f1d","Type":"ContainerDied","Data":"e4ae34d8d439b062d0d5a53d14887a6b1a0afb5e5f767e1d060f81ce70e68a6a"}
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.251449 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hxqsd" event={"ID":"fb2b7d26-6891-4f5d-9765-5b8922abd5cd","Type":"ContainerStarted","Data":"11f0bc97e42c70482e3846b1ef3815a8bd3e4c47b23d278e5721a7917eada74d"}
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.252308 4681 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-fst86 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.252353 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-fst86" podUID="0d8e31b0-652c-44ff-99b0-04ae7d329f6f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused"
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.270343 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-j8hqs" podStartSLOduration=127.270323041 podStartE2EDuration="2m7.270323041s" podCreationTimestamp="2025-10-07 17:03:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:43.255939074 +0000 UTC m=+146.903350649" watchObservedRunningTime="2025-10-07 17:05:43.270323041 +0000 UTC m=+146.917734596"
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.271157 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7vd82" podStartSLOduration=126.271152065 podStartE2EDuration="2m6.271152065s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:43.184658755 +0000 UTC m=+146.832070310" watchObservedRunningTime="2025-10-07 17:05:43.271152065 +0000 UTC m=+146.918563620"
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.299009 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm"
Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.333987 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:43 crc kubenswrapper[4681]: E1007 17:05:43.341374 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:43.841355513 +0000 UTC m=+147.488767068 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.353789 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-rwphq" podStartSLOduration=126.353761983 podStartE2EDuration="2m6.353761983s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:43.340159758 +0000 UTC m=+146.987571313" watchObservedRunningTime="2025-10-07 17:05:43.353761983 +0000 UTC m=+147.001173538" Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.404720 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ns9f4" podStartSLOduration=126.404698581 podStartE2EDuration="2m6.404698581s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:43.399916702 +0000 UTC m=+147.047328267" watchObservedRunningTime="2025-10-07 17:05:43.404698581 +0000 UTC m=+147.052110136" Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.438082 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:43 crc kubenswrapper[4681]: E1007 17:05:43.438334 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:43.938310177 +0000 UTC m=+147.585721732 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.438650 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:43 crc kubenswrapper[4681]: E1007 17:05:43.439041 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:43.939027237 +0000 UTC m=+147.586438802 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.523957 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4g7gf" podStartSLOduration=126.523940202 podStartE2EDuration="2m6.523940202s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:43.438529003 +0000 UTC m=+147.085940558" watchObservedRunningTime="2025-10-07 17:05:43.523940202 +0000 UTC m=+147.171351757" Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.539651 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:43 crc kubenswrapper[4681]: E1007 17:05:43.540025 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:44.040008448 +0000 UTC m=+147.687420003 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.605841 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjns4" podStartSLOduration=126.605825479 podStartE2EDuration="2m6.605825479s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:43.603273685 +0000 UTC m=+147.250685240" watchObservedRunningTime="2025-10-07 17:05:43.605825479 +0000 UTC m=+147.253237034" Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.606520 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm" podStartSLOduration=126.606515569 podStartE2EDuration="2m6.606515569s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:43.539815453 +0000 UTC m=+147.187227028" watchObservedRunningTime="2025-10-07 17:05:43.606515569 +0000 UTC m=+147.253927124" Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.640556 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:43 crc kubenswrapper[4681]: E1007 17:05:43.640940 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:44.140923687 +0000 UTC m=+147.788335242 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.741941 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:43 crc kubenswrapper[4681]: E1007 17:05:43.742112 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:44.242084963 +0000 UTC m=+147.889496508 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.742554 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:43 crc kubenswrapper[4681]: E1007 17:05:43.742856 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:44.242845146 +0000 UTC m=+147.890256691 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.831053 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-hq69t" podStartSLOduration=126.831025004 podStartE2EDuration="2m6.831025004s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:43.729709644 +0000 UTC m=+147.377121209" watchObservedRunningTime="2025-10-07 17:05:43.831025004 +0000 UTC m=+147.478436559" Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.843461 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:43 crc kubenswrapper[4681]: E1007 17:05:43.843953 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:44.343937419 +0000 UTC m=+147.991348974 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.857899 4681 patch_prober.go:28] interesting pod/router-default-5444994796-hcltk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 17:05:43 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld Oct 07 17:05:43 crc kubenswrapper[4681]: [+]process-running ok Oct 07 17:05:43 crc kubenswrapper[4681]: healthz check failed Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.858399 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hcltk" podUID="1ae45842-477f-4cc6-9ff7-6c38b866e8f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 17:05:43 crc kubenswrapper[4681]: I1007 17:05:43.944955 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:43 crc kubenswrapper[4681]: E1007 17:05:43.945310 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:44.445296821 +0000 UTC m=+148.092708376 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.046407 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:44 crc kubenswrapper[4681]: E1007 17:05:44.046701 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:44.546687103 +0000 UTC m=+148.194098658 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.092632 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-csgxx" podStartSLOduration=128.092616846 podStartE2EDuration="2m8.092616846s" podCreationTimestamp="2025-10-07 17:03:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:43.972616344 +0000 UTC m=+147.620027899" watchObservedRunningTime="2025-10-07 17:05:44.092616846 +0000 UTC m=+147.740028401" Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.148192 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:44 crc kubenswrapper[4681]: E1007 17:05:44.148581 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:44.64856417 +0000 UTC m=+148.295975725 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.179000 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5n46l" podStartSLOduration=8.178979713 podStartE2EDuration="8.178979713s" podCreationTimestamp="2025-10-07 17:05:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:44.16956756 +0000 UTC m=+147.816979115" watchObservedRunningTime="2025-10-07 17:05:44.178979713 +0000 UTC m=+147.826391268" Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.248867 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:44 crc kubenswrapper[4681]: E1007 17:05:44.249049 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:44.749017685 +0000 UTC m=+148.396429250 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.249229 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:44 crc kubenswrapper[4681]: E1007 17:05:44.249500 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:44.749488239 +0000 UTC m=+148.396899794 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.251885 4681 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-tw9ww container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.37:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.251915 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" podUID="c2c64f34-b460-412c-b82e-2dbc6c93444e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.37:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.257912 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lq6nd" event={"ID":"1af973a2-4ae5-4a2e-9d36-941f8689054b","Type":"ContainerStarted","Data":"0fc1f733a69a9f50231ddf8ef6c05fb00da801a36349bc7e901d26a432cf8a46"} Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.259241 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ns9f4" event={"ID":"60fcc2ef-a564-4e1e-947d-1dd672c4ced7","Type":"ContainerStarted","Data":"d1a09baf314e519df0cc92ab8cccf549850fc971e4e4e7c5d4d7618250abd2b3"} Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.260859 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7vd82" event={"ID":"0b45f921-126b-4677-889b-67da1ba1840e","Type":"ContainerStarted","Data":"74b5797746efac5734ea6afd705a6c86b20c13c1b31e75a75fe65f4a06f767bd"} Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.262611 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-w6vtn" event={"ID":"d38a6e43-b412-4c01-bf04-1036f9d6942c","Type":"ContainerStarted","Data":"aab15ea6b50430dc23bc47af74f1dccb090886ea349a507e8baacaed3748c608"} Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.262670 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-w6vtn" event={"ID":"d38a6e43-b412-4c01-bf04-1036f9d6942c","Type":"ContainerStarted","Data":"736fdf7fc6faae0a8a4768e8a26ff6d0bebe2dad8bda92cbb0dd8e68d84febb9"} Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.264082 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lq6nd" podStartSLOduration=127.264071623 podStartE2EDuration="2m7.264071623s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:44.262673832 +0000 UTC m=+147.910085387" 
watchObservedRunningTime="2025-10-07 17:05:44.264071623 +0000 UTC m=+147.911483178" Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.266983 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lwhwn" event={"ID":"0abf116d-d4aa-4bb8-95ab-430c181d5bdf","Type":"ContainerStarted","Data":"4204eea614bbc166fe2ffe4b971aa3942aa00f5bb6d72e22c9065d0dd3bb9dfc"} Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.270010 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjns4" event={"ID":"9fd63e72-276a-43fe-8927-da5aba5b7a98","Type":"ContainerStarted","Data":"1e68848ed0ace868211e3a4ebf7edfc69da76d62b4063c61ac296fac374e10d9"} Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.271575 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5n46l" event={"ID":"365d94e6-7b9f-49bb-92db-f5a232406e10","Type":"ContainerStarted","Data":"f31538300ca6a25f6a407ae97effca1261c67f6bd9c268bb6d0a26d6ff049210"} Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.273555 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hq69t" event={"ID":"61783842-70fb-40b1-bc57-f614ca527168","Type":"ContainerStarted","Data":"bada8967ed476c5e7fdf78792d0cc48002e43135934a849fbf6789e6c5ef7c45"} Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.274310 4681 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hq69t container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.274335 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hq69t" podUID="61783842-70fb-40b1-bc57-f614ca527168" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.275489 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x8lr7" event={"ID":"10e7236c-b3fa-4c0c-9d1f-068374e029cb","Type":"ContainerStarted","Data":"fd5e4f60d60da5969058dbfb7703b9cada8c1136c4f99e9d66675d89951768fd"} Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.278527 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8" event={"ID":"d712e0bd-952b-4cba-8d92-2c6e72f6b867","Type":"ContainerStarted","Data":"f9aeb2e3cf70a3afaa894276abbeedcfd155c895d825b280df62f073968101ae"} Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.284064 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-m8gtz" event={"ID":"7a79ff15-67a5-43e6-a92d-84c3168db81b","Type":"ContainerStarted","Data":"bde7ffcc0233768bf02e7e93c7022a1f68354c86aa056c74bde7c74219ee5289"} Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.286224 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgrnc" event={"ID":"a96ffd28-b774-40e1-ad52-e6fa63483f1d","Type":"ContainerStarted","Data":"467f7ff0eaea3143734d0c886facab3c026bb46f9735c356e4d3a66ece627636"} Oct 07 17:05:44 crc 
kubenswrapper[4681]: I1007 17:05:44.286772 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgrnc" Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.288389 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dvzgt" event={"ID":"4ee9a95e-d102-45ea-a77b-711de5bd03a9","Type":"ContainerStarted","Data":"ccd621302e82538793fade8e33f1838b0c3c747119ce9e18ac31bd54e0225c21"} Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.288841 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-dvzgt" Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.292146 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" event={"ID":"77548ba9-d52a-4585-984e-e08c45a58aec","Type":"ContainerStarted","Data":"9b337d41c3a87205c22c925b83877744b89264bf2f7e74ce3dd13ebf6a4f6c8c"} Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.295206 4681 patch_prober.go:28] interesting pod/downloads-7954f5f757-j8hqs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.295248 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-j8hqs" podUID="c669d56a-7d2e-4161-ac70-29d72a747038" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.296074 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9m58d" event={"ID":"2fb24a65-5ed4-42a8-9e29-c8bcd1e0de14","Type":"ContainerStarted","Data":"c3d7cdea1b33f0e49dc438fadfdae048825c36224dee32a4b674755e540c1c2c"} Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.296667 4681 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-bqww9 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.296858 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bqww9" podUID="77d83873-a7b2-42d5-a94e-d7bfc4784cba" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.336007 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.339525 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x8lr7" podStartSLOduration=127.339509362 podStartE2EDuration="2m7.339509362s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:44.318300476 +0000 UTC m=+147.965712031" watchObservedRunningTime="2025-10-07 
17:05:44.339509362 +0000 UTC m=+147.986920917" Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.341390 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-hxqsd" podStartSLOduration=8.341383336 podStartE2EDuration="8.341383336s" podCreationTimestamp="2025-10-07 17:05:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:44.338182663 +0000 UTC m=+147.985594218" watchObservedRunningTime="2025-10-07 17:05:44.341383336 +0000 UTC m=+147.988794891" Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.350564 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:44 crc kubenswrapper[4681]: E1007 17:05:44.350759 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:44.850724138 +0000 UTC m=+148.498135693 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.350911 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:44 crc kubenswrapper[4681]: E1007 17:05:44.352048 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:44.852035785 +0000 UTC m=+148.499447340 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.366574 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgrnc" podStartSLOduration=128.366553957 podStartE2EDuration="2m8.366553957s" podCreationTimestamp="2025-10-07 17:03:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:44.365095745 +0000 UTC m=+148.012507290" watchObservedRunningTime="2025-10-07 17:05:44.366553957 +0000 UTC m=+148.013965512" Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.370411 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x8lr7" Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.450289 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-fst86" Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.455406 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:44 crc kubenswrapper[4681]: E1007 17:05:44.455814 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:44.955798327 +0000 UTC m=+148.603209882 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.464994 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-w6vtn" podStartSLOduration=127.464978273 podStartE2EDuration="2m7.464978273s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:44.464194561 +0000 UTC m=+148.111606116" watchObservedRunningTime="2025-10-07 17:05:44.464978273 +0000 UTC m=+148.112389828" Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.518072 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-dvzgt" podStartSLOduration=8.518053913 podStartE2EDuration="8.518053913s" podCreationTimestamp="2025-10-07 17:05:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:44.515896652 +0000 UTC m=+148.163308217" watchObservedRunningTime="2025-10-07 17:05:44.518053913 +0000 UTC m=+148.165465468" Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.556824 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:44 crc kubenswrapper[4681]: E1007 17:05:44.557275 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:45.057256891 +0000 UTC m=+148.704668486 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.561820 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" podStartSLOduration=128.561795043 podStartE2EDuration="2m8.561795043s" podCreationTimestamp="2025-10-07 17:03:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:44.558004634 +0000 UTC m=+148.205416189" watchObservedRunningTime="2025-10-07 17:05:44.561795043 +0000 UTC m=+148.209206598" Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.596965 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8" podStartSLOduration=127.596947293 podStartE2EDuration="2m7.596947293s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:44.593224975 +0000 UTC m=+148.240636540" watchObservedRunningTime="2025-10-07 17:05:44.596947293 +0000 UTC m=+148.244358848" Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.658452 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:44 crc kubenswrapper[4681]: E1007 17:05:44.658631 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:45.158606423 +0000 UTC m=+148.806017978 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.659016 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:44 crc kubenswrapper[4681]: E1007 17:05:44.659354 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:45.159342675 +0000 UTC m=+148.806754220 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.760500 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:44 crc kubenswrapper[4681]: E1007 17:05:44.760860 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:45.26084465 +0000 UTC m=+148.908256195 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.854222 4681 patch_prober.go:28] interesting pod/router-default-5444994796-hcltk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 17:05:44 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld Oct 07 17:05:44 crc kubenswrapper[4681]: [+]process-running ok Oct 07 17:05:44 crc kubenswrapper[4681]: healthz check failed Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.854283 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hcltk" podUID="1ae45842-477f-4cc6-9ff7-6c38b866e8f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.862320 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:44 crc kubenswrapper[4681]: E1007 17:05:44.862715 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:45.362699596 +0000 UTC m=+149.010111151 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.963839 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:44 crc kubenswrapper[4681]: E1007 17:05:44.964484 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:45.46445568 +0000 UTC m=+149.111867235 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.964672 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.964760 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.965849 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.965963 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.971069 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 17:05:44 crc kubenswrapper[4681]: I1007 17:05:44.990615 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 17:05:45 crc kubenswrapper[4681]: I1007 17:05:45.010729 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 17:05:45 crc kubenswrapper[4681]: I1007 17:05:45.020586 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 17:05:45 crc kubenswrapper[4681]: I1007 17:05:45.048105 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 07 17:05:45 crc kubenswrapper[4681]: I1007 17:05:45.054211 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 07 17:05:45 crc kubenswrapper[4681]: I1007 17:05:45.061238 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 17:05:45 crc kubenswrapper[4681]: I1007 17:05:45.067776 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:45 crc kubenswrapper[4681]: E1007 17:05:45.068216 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:45.56818938 +0000 UTC m=+149.215600935 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:45 crc kubenswrapper[4681]: I1007 17:05:45.169066 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:45 crc kubenswrapper[4681]: E1007 17:05:45.169229 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:45.669202482 +0000 UTC m=+149.316614037 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:45 crc kubenswrapper[4681]: I1007 17:05:45.169349 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:45 crc kubenswrapper[4681]: E1007 17:05:45.169736 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:45.669725307 +0000 UTC m=+149.317136932 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:45 crc kubenswrapper[4681]: I1007 17:05:45.268553 4681 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-lwhwn container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Oct 07 17:05:45 crc kubenswrapper[4681]: I1007 17:05:45.268968 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lwhwn" podUID="0abf116d-d4aa-4bb8-95ab-430c181d5bdf" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Oct 07 17:05:45 crc kubenswrapper[4681]: I1007 17:05:45.270208 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:45 crc kubenswrapper[4681]: E1007 17:05:45.270491 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:45.770471851 +0000 UTC m=+149.417883406 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:45 crc kubenswrapper[4681]: I1007 17:05:45.299617 4681 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hq69t container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body=
Oct 07 17:05:45 crc kubenswrapper[4681]: I1007 17:05:45.299653 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hq69t" podUID="61783842-70fb-40b1-bc57-f614ca527168" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused"
Oct 07 17:05:45 crc kubenswrapper[4681]: I1007 17:05:45.300292 4681 patch_prober.go:28] interesting pod/downloads-7954f5f757-j8hqs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body=
Oct 07 17:05:45 crc kubenswrapper[4681]: I1007 17:05:45.300325 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-j8hqs" podUID="c669d56a-7d2e-4161-ac70-29d72a747038" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused"
Oct 07 17:05:45 crc kubenswrapper[4681]: I1007 17:05:45.300395 4681 patch_prober.go:28] interesting pod/console-operator-58897d9998-csgxx container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Oct 07 17:05:45 crc kubenswrapper[4681]: I1007 17:05:45.300413 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-csgxx" podUID="2fdbba2d-2b9f-47f3-a618-f1284f5bce5b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Oct 07 17:05:45 crc kubenswrapper[4681]: I1007 17:05:45.372357 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:45 crc kubenswrapper[4681]: E1007 17:05:45.374537 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:45.874525731 +0000 UTC m=+149.521937286 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:45 crc kubenswrapper[4681]: I1007 17:05:45.484530 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:45 crc kubenswrapper[4681]: E1007 17:05:45.484896 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:45.984860003 +0000 UTC m=+149.632271558 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:45 crc kubenswrapper[4681]: I1007 17:05:45.589482 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:45 crc kubenswrapper[4681]: E1007 17:05:45.589871 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:46.08985332 +0000 UTC m=+149.737264865 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:45 crc kubenswrapper[4681]: I1007 17:05:45.690403 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:45 crc kubenswrapper[4681]: E1007 17:05:45.690717 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:46.190701877 +0000 UTC m=+149.838113432 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:45 crc kubenswrapper[4681]: I1007 17:05:45.792266 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:45 crc kubenswrapper[4681]: E1007 17:05:45.792661 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:46.292645386 +0000 UTC m=+149.940056951 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:45 crc kubenswrapper[4681]: I1007 17:05:45.868439 4681 patch_prober.go:28] interesting pod/router-default-5444994796-hcltk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 07 17:05:45 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld
Oct 07 17:05:45 crc kubenswrapper[4681]: [+]process-running ok
Oct 07 17:05:45 crc kubenswrapper[4681]: healthz check failed
Oct 07 17:05:45 crc kubenswrapper[4681]: I1007 17:05:45.868506 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hcltk" podUID="1ae45842-477f-4cc6-9ff7-6c38b866e8f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 07 17:05:45 crc kubenswrapper[4681]: I1007 17:05:45.897143 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:45 crc kubenswrapper[4681]: E1007 17:05:45.897484 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:46.397468598 +0000 UTC m=+150.044880153 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:45 crc kubenswrapper[4681]: I1007 17:05:45.942106 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lwhwn"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.000073 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:46 crc kubenswrapper[4681]: E1007 17:05:46.000401 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:46.500390105 +0000 UTC m=+150.147801670 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.102018 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:46 crc kubenswrapper[4681]: E1007 17:05:46.103271 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:46.60325574 +0000 UTC m=+150.250667295 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.204800 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:46 crc kubenswrapper[4681]: E1007 17:05:46.205861 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:46.705849048 +0000 UTC m=+150.353260603 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.214273 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-csgxx"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.291580 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4mrgc"]
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.292559 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4mrgc"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.305253 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.307315 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:46 crc kubenswrapper[4681]: E1007 17:05:46.307673 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:46.807657483 +0000 UTC m=+150.455069038 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.329241 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4mrgc"]
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.333603 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"770b32a5d5981a3eefbf8b5117574223c6a85109170bd9b9a2a6ab96539f6a62"}
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.339206 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-m8gtz" event={"ID":"7a79ff15-67a5-43e6-a92d-84c3168db81b","Type":"ContainerStarted","Data":"4039668ce58509110c930d107eeecb66dd0ccb1fdf3821933b50e35eb867ba60"}
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.347904 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"01e79477f0946d7d2d89e573d9c4f357eaa8f8efcefbb2cd3541879ee4f49aa4"}
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.359005 4681 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-hgrnc container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.359064 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgrnc" podUID="a96ffd28-b774-40e1-ad52-e6fa63483f1d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.408557 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlw5n\" (UniqueName: \"kubernetes.io/projected/29ae017e-1bbd-4cf3-bda3-5fd9a25866c7-kube-api-access-hlw5n\") pod \"community-operators-4mrgc\" (UID: \"29ae017e-1bbd-4cf3-bda3-5fd9a25866c7\") " pod="openshift-marketplace/community-operators-4mrgc"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.408664 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.408751 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29ae017e-1bbd-4cf3-bda3-5fd9a25866c7-utilities\") pod \"community-operators-4mrgc\" (UID: \"29ae017e-1bbd-4cf3-bda3-5fd9a25866c7\") " pod="openshift-marketplace/community-operators-4mrgc"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.408788 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29ae017e-1bbd-4cf3-bda3-5fd9a25866c7-catalog-content\") pod \"community-operators-4mrgc\" (UID: \"29ae017e-1bbd-4cf3-bda3-5fd9a25866c7\") " pod="openshift-marketplace/community-operators-4mrgc"
Oct 07 17:05:46 crc kubenswrapper[4681]: E1007 17:05:46.410092 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:46.910075336 +0000 UTC m=+150.557486891 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.481868 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-svcfw"]
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.482771 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-svcfw"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.487220 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.509612 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:46 crc kubenswrapper[4681]: E1007 17:05:46.509781 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:47.009754708 +0000 UTC m=+150.657166263 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.510170 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlw5n\" (UniqueName: \"kubernetes.io/projected/29ae017e-1bbd-4cf3-bda3-5fd9a25866c7-kube-api-access-hlw5n\") pod \"community-operators-4mrgc\" (UID: \"29ae017e-1bbd-4cf3-bda3-5fd9a25866c7\") " pod="openshift-marketplace/community-operators-4mrgc"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.510225 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.510258 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7psn5\" (UniqueName: \"kubernetes.io/projected/390445e9-214f-423d-b39d-9411ca5cf099-kube-api-access-7psn5\") pod \"certified-operators-svcfw\" (UID: \"390445e9-214f-423d-b39d-9411ca5cf099\") " pod="openshift-marketplace/certified-operators-svcfw"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.510292 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/390445e9-214f-423d-b39d-9411ca5cf099-utilities\") pod \"certified-operators-svcfw\" (UID: \"390445e9-214f-423d-b39d-9411ca5cf099\") " pod="openshift-marketplace/certified-operators-svcfw"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.510332 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/390445e9-214f-423d-b39d-9411ca5cf099-catalog-content\") pod \"certified-operators-svcfw\" (UID: \"390445e9-214f-423d-b39d-9411ca5cf099\") " pod="openshift-marketplace/certified-operators-svcfw"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.510437 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29ae017e-1bbd-4cf3-bda3-5fd9a25866c7-utilities\") pod \"community-operators-4mrgc\" (UID: \"29ae017e-1bbd-4cf3-bda3-5fd9a25866c7\") " pod="openshift-marketplace/community-operators-4mrgc"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.510544 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29ae017e-1bbd-4cf3-bda3-5fd9a25866c7-catalog-content\") pod \"community-operators-4mrgc\" (UID: \"29ae017e-1bbd-4cf3-bda3-5fd9a25866c7\") " pod="openshift-marketplace/community-operators-4mrgc"
Oct 07 17:05:46 crc kubenswrapper[4681]: E1007 17:05:46.510560 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:47.010553142 +0000 UTC m=+150.657964687 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.511128 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29ae017e-1bbd-4cf3-bda3-5fd9a25866c7-catalog-content\") pod \"community-operators-4mrgc\" (UID: \"29ae017e-1bbd-4cf3-bda3-5fd9a25866c7\") " pod="openshift-marketplace/community-operators-4mrgc"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.511141 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29ae017e-1bbd-4cf3-bda3-5fd9a25866c7-utilities\") pod \"community-operators-4mrgc\" (UID: \"29ae017e-1bbd-4cf3-bda3-5fd9a25866c7\") " pod="openshift-marketplace/community-operators-4mrgc"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.540754 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-svcfw"]
Oct 07 17:05:46 crc kubenswrapper[4681]: W1007 17:05:46.544709 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-725e7cc2a9be60308a834bd1b3fc9ebc97b9119e1e48165918e81102bb2017ac WatchSource:0}: Error finding container 725e7cc2a9be60308a834bd1b3fc9ebc97b9119e1e48165918e81102bb2017ac: Status 404 returned error can't find the container with id 725e7cc2a9be60308a834bd1b3fc9ebc97b9119e1e48165918e81102bb2017ac
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.610956 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlw5n\" (UniqueName: \"kubernetes.io/projected/29ae017e-1bbd-4cf3-bda3-5fd9a25866c7-kube-api-access-hlw5n\") pod \"community-operators-4mrgc\" (UID: \"29ae017e-1bbd-4cf3-bda3-5fd9a25866c7\") " pod="openshift-marketplace/community-operators-4mrgc"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.611585 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.611815 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7psn5\" (UniqueName: \"kubernetes.io/projected/390445e9-214f-423d-b39d-9411ca5cf099-kube-api-access-7psn5\") pod \"certified-operators-svcfw\" (UID: \"390445e9-214f-423d-b39d-9411ca5cf099\") " pod="openshift-marketplace/certified-operators-svcfw"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.611848 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/390445e9-214f-423d-b39d-9411ca5cf099-utilities\") pod \"certified-operators-svcfw\" (UID: \"390445e9-214f-423d-b39d-9411ca5cf099\") " pod="openshift-marketplace/certified-operators-svcfw"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.611889 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/390445e9-214f-423d-b39d-9411ca5cf099-catalog-content\") pod \"certified-operators-svcfw\" (UID: \"390445e9-214f-423d-b39d-9411ca5cf099\") " pod="openshift-marketplace/certified-operators-svcfw"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.612215 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/390445e9-214f-423d-b39d-9411ca5cf099-catalog-content\") pod \"certified-operators-svcfw\" (UID: \"390445e9-214f-423d-b39d-9411ca5cf099\") " pod="openshift-marketplace/certified-operators-svcfw"
Oct 07 17:05:46 crc kubenswrapper[4681]: E1007 17:05:46.612280 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:47.112267184 +0000 UTC m=+150.759678739 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.612745 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/390445e9-214f-423d-b39d-9411ca5cf099-utilities\") pod \"certified-operators-svcfw\" (UID: \"390445e9-214f-423d-b39d-9411ca5cf099\") " pod="openshift-marketplace/certified-operators-svcfw"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.648066 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4mrgc"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.652595 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7psn5\" (UniqueName: \"kubernetes.io/projected/390445e9-214f-423d-b39d-9411ca5cf099-kube-api-access-7psn5\") pod \"certified-operators-svcfw\" (UID: \"390445e9-214f-423d-b39d-9411ca5cf099\") " pod="openshift-marketplace/certified-operators-svcfw"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.664633 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cvgwt"]
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.666732 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cvgwt"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.714947 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8118bad-bdb8-4a82-aef8-a70d685fe13a-utilities\") pod \"community-operators-cvgwt\" (UID: \"c8118bad-bdb8-4a82-aef8-a70d685fe13a\") " pod="openshift-marketplace/community-operators-cvgwt"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.715000 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.715071 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs7d4\" (UniqueName: \"kubernetes.io/projected/c8118bad-bdb8-4a82-aef8-a70d685fe13a-kube-api-access-xs7d4\") pod \"community-operators-cvgwt\" (UID: \"c8118bad-bdb8-4a82-aef8-a70d685fe13a\") " pod="openshift-marketplace/community-operators-cvgwt"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.715104 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8118bad-bdb8-4a82-aef8-a70d685fe13a-catalog-content\") pod \"community-operators-cvgwt\" (UID: \"c8118bad-bdb8-4a82-aef8-a70d685fe13a\") " pod="openshift-marketplace/community-operators-cvgwt"
Oct 07 17:05:46 crc kubenswrapper[4681]: E1007 17:05:46.715501 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:47.215484959 +0000 UTC m=+150.862896514 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.750386 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cvgwt"]
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.795623 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-svcfw"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.816362 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.816993 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8118bad-bdb8-4a82-aef8-a70d685fe13a-catalog-content\") pod \"community-operators-cvgwt\" (UID: \"c8118bad-bdb8-4a82-aef8-a70d685fe13a\") " pod="openshift-marketplace/community-operators-cvgwt"
Oct 07 17:05:46 crc kubenswrapper[4681]: E1007 17:05:46.817046 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:47.317019486 +0000 UTC m=+150.964431041 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.817088 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8118bad-bdb8-4a82-aef8-a70d685fe13a-utilities\") pod \"community-operators-cvgwt\" (UID: \"c8118bad-bdb8-4a82-aef8-a70d685fe13a\") " pod="openshift-marketplace/community-operators-cvgwt"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.817137 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.817283 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs7d4\" (UniqueName: \"kubernetes.io/projected/c8118bad-bdb8-4a82-aef8-a70d685fe13a-kube-api-access-xs7d4\") pod \"community-operators-cvgwt\" (UID: \"c8118bad-bdb8-4a82-aef8-a70d685fe13a\") " pod="openshift-marketplace/community-operators-cvgwt"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.817473 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8118bad-bdb8-4a82-aef8-a70d685fe13a-catalog-content\") pod \"community-operators-cvgwt\" (UID: \"c8118bad-bdb8-4a82-aef8-a70d685fe13a\") " pod="openshift-marketplace/community-operators-cvgwt"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.817766 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8118bad-bdb8-4a82-aef8-a70d685fe13a-utilities\") pod \"community-operators-cvgwt\" (UID: \"c8118bad-bdb8-4a82-aef8-a70d685fe13a\") " pod="openshift-marketplace/community-operators-cvgwt"
Oct 07 17:05:46 crc kubenswrapper[4681]: E1007 17:05:46.817796 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:47.317788558 +0000 UTC m=+150.965200113 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.860065 4681 patch_prober.go:28] interesting pod/router-default-5444994796-hcltk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 07 17:05:46 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld
Oct 07 17:05:46 crc kubenswrapper[4681]: [+]process-running ok
Oct 07 17:05:46 crc kubenswrapper[4681]: healthz check failed
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.860111 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hcltk" podUID="1ae45842-477f-4cc6-9ff7-6c38b866e8f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.909379 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mj5z5"]
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.910419 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mj5z5"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.918054 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:46 crc kubenswrapper[4681]: E1007 17:05:46.918302 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:47.418269194 +0000 UTC m=+151.065680739 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.918753 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:46 crc kubenswrapper[4681]: E1007 17:05:46.919262 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:47.419248692 +0000 UTC m=+151.066660237 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.920853 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs7d4\" (UniqueName: \"kubernetes.io/projected/c8118bad-bdb8-4a82-aef8-a70d685fe13a-kube-api-access-xs7d4\") pod \"community-operators-cvgwt\" (UID: \"c8118bad-bdb8-4a82-aef8-a70d685fe13a\") " pod="openshift-marketplace/community-operators-cvgwt"
Oct 07 17:05:46 crc kubenswrapper[4681]: I1007 17:05:46.977497 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mj5z5"]
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.007819 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cvgwt"
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.025826 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.026203 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ecebb16-848c-4597-8f86-7779f4c82530-utilities\") pod \"certified-operators-mj5z5\" (UID: \"2ecebb16-848c-4597-8f86-7779f4c82530\") " pod="openshift-marketplace/certified-operators-mj5z5"
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.026228 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wqll\" (UniqueName: \"kubernetes.io/projected/2ecebb16-848c-4597-8f86-7779f4c82530-kube-api-access-5wqll\") pod \"certified-operators-mj5z5\" (UID: \"2ecebb16-848c-4597-8f86-7779f4c82530\") " pod="openshift-marketplace/certified-operators-mj5z5"
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.026273 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ecebb16-848c-4597-8f86-7779f4c82530-catalog-content\") pod \"certified-operators-mj5z5\" (UID: \"2ecebb16-848c-4597-8f86-7779f4c82530\") " pod="openshift-marketplace/certified-operators-mj5z5"
Oct 07 17:05:47 crc kubenswrapper[4681]: E1007 17:05:47.026404 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:47.526389643 +0000 UTC m=+151.173801188 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.129630 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ecebb16-848c-4597-8f86-7779f4c82530-catalog-content\") pod \"certified-operators-mj5z5\" (UID: \"2ecebb16-848c-4597-8f86-7779f4c82530\") " pod="openshift-marketplace/certified-operators-mj5z5"
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.129682 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.129729 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ecebb16-848c-4597-8f86-7779f4c82530-utilities\") pod \"certified-operators-mj5z5\" (UID: \"2ecebb16-848c-4597-8f86-7779f4c82530\") " pod="openshift-marketplace/certified-operators-mj5z5"
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.129746 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wqll\" (UniqueName: \"kubernetes.io/projected/2ecebb16-848c-4597-8f86-7779f4c82530-kube-api-access-5wqll\") pod \"certified-operators-mj5z5\" (UID: \"2ecebb16-848c-4597-8f86-7779f4c82530\") " pod="openshift-marketplace/certified-operators-mj5z5"
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.130294 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ecebb16-848c-4597-8f86-7779f4c82530-catalog-content\") pod \"certified-operators-mj5z5\" (UID: \"2ecebb16-848c-4597-8f86-7779f4c82530\") " pod="openshift-marketplace/certified-operators-mj5z5"
Oct 07 17:05:47 crc kubenswrapper[4681]: E1007 17:05:47.130519 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:47.630506984 +0000 UTC m=+151.277918539 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.130770 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ecebb16-848c-4597-8f86-7779f4c82530-utilities\") pod \"certified-operators-mj5z5\" (UID: \"2ecebb16-848c-4597-8f86-7779f4c82530\") " pod="openshift-marketplace/certified-operators-mj5z5"
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.183535 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wqll\" (UniqueName: \"kubernetes.io/projected/2ecebb16-848c-4597-8f86-7779f4c82530-kube-api-access-5wqll\") pod \"certified-operators-mj5z5\" (UID: \"2ecebb16-848c-4597-8f86-7779f4c82530\") " pod="openshift-marketplace/certified-operators-mj5z5"
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.231824 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:47 crc kubenswrapper[4681]: E1007 17:05:47.232079 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:47.732064501 +0000 UTC m=+151.379476056 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.247282 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mj5z5"
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.334277 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:47 crc kubenswrapper[4681]: E1007 17:05:47.334960 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:47.834945708 +0000 UTC m=+151.482357253 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.390351 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"025a05a940fb3b2af96dda6cda0806fb103050574359a0d3914a411b83adfdb7"}
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.390391 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"725e7cc2a9be60308a834bd1b3fc9ebc97b9119e1e48165918e81102bb2017ac"}
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.412869 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-m8gtz" event={"ID":"7a79ff15-67a5-43e6-a92d-84c3168db81b","Type":"ContainerStarted","Data":"23733b4eb32a4ec1acbc5446093bb1aa23c1f2395c25c58b4be0bbbb97600a91"}
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.424345 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"44d41d569d516b00f5830427afa9289ce3e9eba3b7ca033654ca2383c2fbd1a4"}
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.436948 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:47 crc kubenswrapper[4681]: E1007 17:05:47.437814 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:47.937799003 +0000 UTC m=+151.585210558 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.483865 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"1dd0688340f5496e131dd6deea72617e8f83600db3e282b51fe9ad778d3a86aa"}
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.484567 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.540191 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:47 crc kubenswrapper[4681]: E1007 17:05:47.540778 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:48.040767191 +0000 UTC m=+151.688178746 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.642041 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:47 crc kubenswrapper[4681]: E1007 17:05:47.642371 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:48.14235444 +0000 UTC m=+151.789765995 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.744594 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:47 crc kubenswrapper[4681]: E1007 17:05:47.744955 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:48.244927696 +0000 UTC m=+151.892339251 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.833830 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4mrgc"]
Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.856631 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 07 17:05:47 crc kubenswrapper[4681]: E1007 17:05:47.857221 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:48.357201534 +0000 UTC m=+152.004613089 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.883649 4681 patch_prober.go:28] interesting pod/router-default-5444994796-hcltk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 17:05:47 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld Oct 07 17:05:47 crc kubenswrapper[4681]: [+]process-running ok Oct 07 17:05:47 crc kubenswrapper[4681]: healthz check failed Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.883690 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hcltk" podUID="1ae45842-477f-4cc6-9ff7-6c38b866e8f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.957776 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:47 crc kubenswrapper[4681]: E1007 17:05:47.958307 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:48.458293908 +0000 UTC m=+152.105705473 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:47 crc kubenswrapper[4681]: I1007 17:05:47.968847 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hgrnc" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.003150 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-svcfw"] Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.063181 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:48 crc kubenswrapper[4681]: E1007 17:05:48.064259 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:48.564237493 +0000 UTC m=+152.211649048 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.175854 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:48 crc kubenswrapper[4681]: E1007 17:05:48.176541 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:48.676526392 +0000 UTC m=+152.323937947 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.201333 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-nds8d" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.201376 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-nds8d" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.210333 4681 patch_prober.go:28] interesting pod/console-f9d7485db-nds8d container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.21:8443/health\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.210382 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nds8d" podUID="dff981f7-635e-4b45-bf64-fbb57407582b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.21:8443/health\": dial tcp 10.217.0.21:8443: connect: connection refused" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.243529 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.243576 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.273351 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cvgwt"] Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.278129 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:48 crc kubenswrapper[4681]: E1007 17:05:48.279244 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:48.779218733 +0000 UTC m=+152.426630288 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.332892 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mj5z5"] Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.353724 4681 patch_prober.go:28] interesting pod/apiserver-76f77b778f-rgk2c container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 07 17:05:48 crc kubenswrapper[4681]: [+]log ok Oct 07 17:05:48 crc kubenswrapper[4681]: [+]etcd ok Oct 07 17:05:48 crc kubenswrapper[4681]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 07 17:05:48 crc kubenswrapper[4681]: [+]poststarthook/generic-apiserver-start-informers ok Oct 07 17:05:48 crc kubenswrapper[4681]: [+]poststarthook/max-in-flight-filter ok Oct 07 17:05:48 crc kubenswrapper[4681]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 07 17:05:48 crc kubenswrapper[4681]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 07 17:05:48 crc kubenswrapper[4681]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 07 17:05:48 crc kubenswrapper[4681]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 07 17:05:48 crc kubenswrapper[4681]: [+]poststarthook/project.openshift.io-projectcache ok Oct 07 17:05:48 crc kubenswrapper[4681]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 07 17:05:48 crc kubenswrapper[4681]: [+]poststarthook/openshift.io-startinformers ok Oct 07 17:05:48 crc kubenswrapper[4681]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 07 17:05:48 crc kubenswrapper[4681]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 07 17:05:48 crc kubenswrapper[4681]: livez check failed Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.353790 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" podUID="77548ba9-d52a-4585-984e-e08c45a58aec" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.385531 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:48 crc kubenswrapper[4681]: E1007 17:05:48.386787 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:48.886769714 +0000 UTC m=+152.534181369 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.448129 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.448366 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.456066 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-64mxk"] Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.456489 4681 patch_prober.go:28] interesting pod/downloads-7954f5f757-j8hqs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.456505 4681 patch_prober.go:28] interesting pod/downloads-7954f5f757-j8hqs container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.456542 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-j8hqs" podUID="c669d56a-7d2e-4161-ac70-29d72a747038" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.456548 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-j8hqs" podUID="c669d56a-7d2e-4161-ac70-29d72a747038" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.457533 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64mxk" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.466762 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.479046 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-64mxk"] Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.483094 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.487535 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:48 crc kubenswrapper[4681]: E1007 17:05:48.488005 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:48.987987921 +0000 UTC m=+152.635399476 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.537565 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svcfw" event={"ID":"390445e9-214f-423d-b39d-9411ca5cf099","Type":"ContainerStarted","Data":"20cb1d15fe54a02f9ef0be57f0f914459cf0dc514c8f9beb3ed9f86ec7a11e7d"} Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.537608 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svcfw" event={"ID":"390445e9-214f-423d-b39d-9411ca5cf099","Type":"ContainerStarted","Data":"95c86c1191ef5e01212c04203e4320d28e7ea09ae385b5fb3dcff574726791fb"} Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.550479 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cvgwt" event={"ID":"c8118bad-bdb8-4a82-aef8-a70d685fe13a","Type":"ContainerStarted","Data":"55bfe034e893a89a1c377b29c55f9249eccacca3865af67bc53ab115d7b480e4"} Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.564305 4681 generic.go:334] "Generic (PLEG): container finished" podID="29ae017e-1bbd-4cf3-bda3-5fd9a25866c7" containerID="3351bbf0473ce77bdf501af4ba26589490a1caba900809edc6a0d7a99b32754e" exitCode=0 Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.565078 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mrgc" event={"ID":"29ae017e-1bbd-4cf3-bda3-5fd9a25866c7","Type":"ContainerDied","Data":"3351bbf0473ce77bdf501af4ba26589490a1caba900809edc6a0d7a99b32754e"} Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.565107 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-4mrgc" event={"ID":"29ae017e-1bbd-4cf3-bda3-5fd9a25866c7","Type":"ContainerStarted","Data":"ec0936ae1068a70aca9997ea6801b5a4d6464ae2be034d86021d1d1609bea179"} Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.567601 4681 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.580007 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-m8gtz" event={"ID":"7a79ff15-67a5-43e6-a92d-84c3168db81b","Type":"ContainerStarted","Data":"32a3cfa5c0b8aae415e4f5bb9410f706ad36f99df4603f01509570139ee3afc9"} Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.583411 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mj5z5" event={"ID":"2ecebb16-848c-4597-8f86-7779f4c82530","Type":"ContainerStarted","Data":"9e112731647abbde06dfff824c0753a337973eb6f7636f07750f88d6c1e8557e"} Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.590363 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.590402 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbfe57d1-0360-4f50-b36c-cc80a36f868e-catalog-content\") pod \"redhat-marketplace-64mxk\" (UID: \"dbfe57d1-0360-4f50-b36c-cc80a36f868e\") " pod="openshift-marketplace/redhat-marketplace-64mxk" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.590427 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmd5p\" (UniqueName: \"kubernetes.io/projected/dbfe57d1-0360-4f50-b36c-cc80a36f868e-kube-api-access-rmd5p\") pod \"redhat-marketplace-64mxk\" (UID: \"dbfe57d1-0360-4f50-b36c-cc80a36f868e\") " pod="openshift-marketplace/redhat-marketplace-64mxk" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.590537 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbfe57d1-0360-4f50-b36c-cc80a36f868e-utilities\") pod \"redhat-marketplace-64mxk\" (UID: \"dbfe57d1-0360-4f50-b36c-cc80a36f868e\") " pod="openshift-marketplace/redhat-marketplace-64mxk" Oct 07 17:05:48 crc kubenswrapper[4681]: E1007 17:05:48.591707 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:49.091691381 +0000 UTC m=+152.739102936 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.594821 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6jzt8" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.692388 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.693176 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbfe57d1-0360-4f50-b36c-cc80a36f868e-utilities\") pod \"redhat-marketplace-64mxk\" (UID: \"dbfe57d1-0360-4f50-b36c-cc80a36f868e\") " pod="openshift-marketplace/redhat-marketplace-64mxk" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.693349 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbfe57d1-0360-4f50-b36c-cc80a36f868e-catalog-content\") pod \"redhat-marketplace-64mxk\" (UID: \"dbfe57d1-0360-4f50-b36c-cc80a36f868e\") " pod="openshift-marketplace/redhat-marketplace-64mxk" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.693426 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmd5p\" (UniqueName: \"kubernetes.io/projected/dbfe57d1-0360-4f50-b36c-cc80a36f868e-kube-api-access-rmd5p\") pod \"redhat-marketplace-64mxk\" (UID: \"dbfe57d1-0360-4f50-b36c-cc80a36f868e\") " pod="openshift-marketplace/redhat-marketplace-64mxk" Oct 07 17:05:48 crc kubenswrapper[4681]: E1007 17:05:48.693895 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:49.193861347 +0000 UTC m=+152.841272902 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.695860 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbfe57d1-0360-4f50-b36c-cc80a36f868e-utilities\") pod \"redhat-marketplace-64mxk\" (UID: \"dbfe57d1-0360-4f50-b36c-cc80a36f868e\") " pod="openshift-marketplace/redhat-marketplace-64mxk" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.706226 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbfe57d1-0360-4f50-b36c-cc80a36f868e-catalog-content\") pod \"redhat-marketplace-64mxk\" (UID: \"dbfe57d1-0360-4f50-b36c-cc80a36f868e\") " pod="openshift-marketplace/redhat-marketplace-64mxk" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.736547 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmd5p\" (UniqueName: \"kubernetes.io/projected/dbfe57d1-0360-4f50-b36c-cc80a36f868e-kube-api-access-rmd5p\") pod \"redhat-marketplace-64mxk\" (UID: \"dbfe57d1-0360-4f50-b36c-cc80a36f868e\") " pod="openshift-marketplace/redhat-marketplace-64mxk" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.795899 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64mxk" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.796646 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:48 crc kubenswrapper[4681]: E1007 17:05:48.796969 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:49.296956469 +0000 UTC m=+152.944368024 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.840227 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-m8gtz" podStartSLOduration=12.840209404 podStartE2EDuration="12.840209404s" podCreationTimestamp="2025-10-07 17:05:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:48.837360451 +0000 UTC m=+152.484772006" watchObservedRunningTime="2025-10-07 17:05:48.840209404 +0000 UTC m=+152.487620959" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.852835 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-hcltk" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.854446 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jnv6r"] Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.859567 4681 patch_prober.go:28] interesting pod/router-default-5444994796-hcltk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 17:05:48 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld Oct 07 17:05:48 crc kubenswrapper[4681]: [+]process-running ok Oct 07 17:05:48 crc kubenswrapper[4681]: healthz check failed Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.859596 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jnv6r" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.859618 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hcltk" podUID="1ae45842-477f-4cc6-9ff7-6c38b866e8f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.884102 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jnv6r"] Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.898959 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:48 crc kubenswrapper[4681]: E1007 17:05:48.899125 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:49.399106463 +0000 UTC m=+153.046518028 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.899204 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:48 crc kubenswrapper[4681]: E1007 17:05:48.899556 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:49.399549166 +0000 UTC m=+153.046960711 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:48 crc kubenswrapper[4681]: I1007 17:05:48.964445 4681 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.000003 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.000242 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb8f4ff7-5170-425e-855c-0684f4bdf34b-utilities\") pod \"redhat-marketplace-jnv6r\" (UID: \"fb8f4ff7-5170-425e-855c-0684f4bdf34b\") " pod="openshift-marketplace/redhat-marketplace-jnv6r" Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.000326 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb8f4ff7-5170-425e-855c-0684f4bdf34b-catalog-content\") pod \"redhat-marketplace-jnv6r\" (UID: \"fb8f4ff7-5170-425e-855c-0684f4bdf34b\") " pod="openshift-marketplace/redhat-marketplace-jnv6r" Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.000354 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvvcz\" (UniqueName: \"kubernetes.io/projected/fb8f4ff7-5170-425e-855c-0684f4bdf34b-kube-api-access-jvvcz\") pod \"redhat-marketplace-jnv6r\" (UID: \"fb8f4ff7-5170-425e-855c-0684f4bdf34b\") " 
pod="openshift-marketplace/redhat-marketplace-jnv6r" Oct 07 17:05:49 crc kubenswrapper[4681]: E1007 17:05:49.001155 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:49.501139135 +0000 UTC m=+153.148550690 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.101490 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb8f4ff7-5170-425e-855c-0684f4bdf34b-catalog-content\") pod \"redhat-marketplace-jnv6r\" (UID: \"fb8f4ff7-5170-425e-855c-0684f4bdf34b\") " pod="openshift-marketplace/redhat-marketplace-jnv6r" Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.101754 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.101773 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvvcz\" (UniqueName: \"kubernetes.io/projected/fb8f4ff7-5170-425e-855c-0684f4bdf34b-kube-api-access-jvvcz\") pod \"redhat-marketplace-jnv6r\" (UID: \"fb8f4ff7-5170-425e-855c-0684f4bdf34b\") " pod="openshift-marketplace/redhat-marketplace-jnv6r" Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.101828 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb8f4ff7-5170-425e-855c-0684f4bdf34b-utilities\") pod \"redhat-marketplace-jnv6r\" (UID: \"fb8f4ff7-5170-425e-855c-0684f4bdf34b\") " pod="openshift-marketplace/redhat-marketplace-jnv6r" Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.102208 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb8f4ff7-5170-425e-855c-0684f4bdf34b-utilities\") pod \"redhat-marketplace-jnv6r\" (UID: \"fb8f4ff7-5170-425e-855c-0684f4bdf34b\") " pod="openshift-marketplace/redhat-marketplace-jnv6r" Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.102403 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb8f4ff7-5170-425e-855c-0684f4bdf34b-catalog-content\") pod \"redhat-marketplace-jnv6r\" (UID: \"fb8f4ff7-5170-425e-855c-0684f4bdf34b\") " pod="openshift-marketplace/redhat-marketplace-jnv6r" Oct 07 17:05:49 crc kubenswrapper[4681]: E1007 17:05:49.102619 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-07 17:05:49.60260906 +0000 UTC m=+153.250020615 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.137678 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvvcz\" (UniqueName: \"kubernetes.io/projected/fb8f4ff7-5170-425e-855c-0684f4bdf34b-kube-api-access-jvvcz\") pod \"redhat-marketplace-jnv6r\" (UID: \"fb8f4ff7-5170-425e-855c-0684f4bdf34b\") " pod="openshift-marketplace/redhat-marketplace-jnv6r" Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.188011 4681 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-07T17:05:48.964471801Z","Handler":null,"Name":""} Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.203054 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:49 crc kubenswrapper[4681]: E1007 17:05:49.203260 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 17:05:49.70322824 +0000 UTC m=+153.350639795 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.205006 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:49 crc kubenswrapper[4681]: E1007 17:05:49.205301 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 17:05:49.705288249 +0000 UTC m=+153.352699804 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vr5kp" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.212677 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jnv6r" Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.233140 4681 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.233191 4681 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.306425 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.385555 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.407620 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.442831 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-64mxk"] Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.454656 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8lxg2"] Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.455651 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8lxg2" Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.456388 4681 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.456425 4681 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.465403 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.480520 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8lxg2"]
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.510298 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eab9b71a-d59d-436b-8c0b-c62801ea9326-utilities\") pod \"redhat-operators-8lxg2\" (UID: \"eab9b71a-d59d-436b-8c0b-c62801ea9326\") " pod="openshift-marketplace/redhat-operators-8lxg2"
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.510343 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eab9b71a-d59d-436b-8c0b-c62801ea9326-catalog-content\") pod \"redhat-operators-8lxg2\" (UID: \"eab9b71a-d59d-436b-8c0b-c62801ea9326\") " pod="openshift-marketplace/redhat-operators-8lxg2"
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.510363 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm2q4\" (UniqueName: \"kubernetes.io/projected/eab9b71a-d59d-436b-8c0b-c62801ea9326-kube-api-access-cm2q4\") pod \"redhat-operators-8lxg2\" (UID: \"eab9b71a-d59d-436b-8c0b-c62801ea9326\") " pod="openshift-marketplace/redhat-operators-8lxg2"
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.518171 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bqww9"
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.600605 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-hq69t"
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.602243 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vr5kp\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") " pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.607230 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.614582 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eab9b71a-d59d-436b-8c0b-c62801ea9326-utilities\") pod \"redhat-operators-8lxg2\" (UID: \"eab9b71a-d59d-436b-8c0b-c62801ea9326\") " pod="openshift-marketplace/redhat-operators-8lxg2"
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.614781 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eab9b71a-d59d-436b-8c0b-c62801ea9326-catalog-content\") pod \"redhat-operators-8lxg2\" (UID: \"eab9b71a-d59d-436b-8c0b-c62801ea9326\") " pod="openshift-marketplace/redhat-operators-8lxg2"
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.614899 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm2q4\" (UniqueName: \"kubernetes.io/projected/eab9b71a-d59d-436b-8c0b-c62801ea9326-kube-api-access-cm2q4\") pod \"redhat-operators-8lxg2\" (UID: \"eab9b71a-d59d-436b-8c0b-c62801ea9326\") " pod="openshift-marketplace/redhat-operators-8lxg2"
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.616105 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eab9b71a-d59d-436b-8c0b-c62801ea9326-utilities\") pod \"redhat-operators-8lxg2\" (UID: \"eab9b71a-d59d-436b-8c0b-c62801ea9326\") " pod="openshift-marketplace/redhat-operators-8lxg2"
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.616645 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eab9b71a-d59d-436b-8c0b-c62801ea9326-catalog-content\") pod \"redhat-operators-8lxg2\" (UID: \"eab9b71a-d59d-436b-8c0b-c62801ea9326\") " pod="openshift-marketplace/redhat-operators-8lxg2"
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.624424 4681 generic.go:334] "Generic (PLEG): container finished" podID="abc28b46-2a9f-4141-8e65-a9c956e0f261" containerID="86c8f25c68b4cd646ad8682cad9f74bc3d777bdb6ed7b62f93e0ab1890d5d373" exitCode=0
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.624671 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330940-w7fp5" event={"ID":"abc28b46-2a9f-4141-8e65-a9c956e0f261","Type":"ContainerDied","Data":"86c8f25c68b4cd646ad8682cad9f74bc3d777bdb6ed7b62f93e0ab1890d5d373"}
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.654606 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm2q4\" (UniqueName: \"kubernetes.io/projected/eab9b71a-d59d-436b-8c0b-c62801ea9326-kube-api-access-cm2q4\") pod \"redhat-operators-8lxg2\" (UID: \"eab9b71a-d59d-436b-8c0b-c62801ea9326\") " pod="openshift-marketplace/redhat-operators-8lxg2"
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.663807 4681 generic.go:334] "Generic (PLEG): container finished" podID="2ecebb16-848c-4597-8f86-7779f4c82530" containerID="3a6869215e814e2e92aab9a538b57f358bbd8642c0b84e8fae0395d4a3a633d5" exitCode=0
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.665206 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mj5z5" event={"ID":"2ecebb16-848c-4597-8f86-7779f4c82530","Type":"ContainerDied","Data":"3a6869215e814e2e92aab9a538b57f358bbd8642c0b84e8fae0395d4a3a633d5"}
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.697500 4681 generic.go:334] "Generic (PLEG): container finished" podID="390445e9-214f-423d-b39d-9411ca5cf099" containerID="20cb1d15fe54a02f9ef0be57f0f914459cf0dc514c8f9beb3ed9f86ec7a11e7d" exitCode=0
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.697869 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svcfw" event={"ID":"390445e9-214f-423d-b39d-9411ca5cf099","Type":"ContainerDied","Data":"20cb1d15fe54a02f9ef0be57f0f914459cf0dc514c8f9beb3ed9f86ec7a11e7d"}
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.725577 4681 generic.go:334] "Generic (PLEG): container finished" podID="c8118bad-bdb8-4a82-aef8-a70d685fe13a" containerID="c436777cbef8bbb19979981f72d809cb2bb2886a456af59a415e7b59b43dab38" exitCode=0
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.725764 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cvgwt" event={"ID":"c8118bad-bdb8-4a82-aef8-a70d685fe13a","Type":"ContainerDied","Data":"c436777cbef8bbb19979981f72d809cb2bb2886a456af59a415e7b59b43dab38"}
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.733806 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64mxk" event={"ID":"dbfe57d1-0360-4f50-b36c-cc80a36f868e","Type":"ContainerStarted","Data":"bd250fbc9ef6a2bdb50bf5820df03781d4d76e7e81ced7222a5c312f2d304ae9"}
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.752182 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jnv6r"]
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.780677 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8lxg2"
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.850218 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f9d22"]
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.851444 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f9d22"
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.872071 4681 patch_prober.go:28] interesting pod/router-default-5444994796-hcltk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 07 17:05:49 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld
Oct 07 17:05:49 crc kubenswrapper[4681]: [+]process-running ok
Oct 07 17:05:49 crc kubenswrapper[4681]: healthz check failed
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.872402 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hcltk" podUID="1ae45842-477f-4cc6-9ff7-6c38b866e8f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.875494 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f9d22"]
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.926556 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eda3a2b0-8a17-40b1-b463-7b98159360db-catalog-content\") pod \"redhat-operators-f9d22\" (UID: \"eda3a2b0-8a17-40b1-b463-7b98159360db\") " pod="openshift-marketplace/redhat-operators-f9d22"
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.926627 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqzlc\" (UniqueName: \"kubernetes.io/projected/eda3a2b0-8a17-40b1-b463-7b98159360db-kube-api-access-wqzlc\") pod \"redhat-operators-f9d22\" (UID: \"eda3a2b0-8a17-40b1-b463-7b98159360db\") " pod="openshift-marketplace/redhat-operators-f9d22"
Oct 07 17:05:49 crc kubenswrapper[4681]: I1007 17:05:49.926675 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eda3a2b0-8a17-40b1-b463-7b98159360db-utilities\") pod \"redhat-operators-f9d22\" (UID: \"eda3a2b0-8a17-40b1-b463-7b98159360db\") " pod="openshift-marketplace/redhat-operators-f9d22"
Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.029760 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eda3a2b0-8a17-40b1-b463-7b98159360db-catalog-content\") pod \"redhat-operators-f9d22\" (UID: \"eda3a2b0-8a17-40b1-b463-7b98159360db\") " pod="openshift-marketplace/redhat-operators-f9d22"
Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.030336 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqzlc\" (UniqueName: \"kubernetes.io/projected/eda3a2b0-8a17-40b1-b463-7b98159360db-kube-api-access-wqzlc\") pod \"redhat-operators-f9d22\" (UID: \"eda3a2b0-8a17-40b1-b463-7b98159360db\") " pod="openshift-marketplace/redhat-operators-f9d22"
Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.031239 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eda3a2b0-8a17-40b1-b463-7b98159360db-utilities\") pod \"redhat-operators-f9d22\" (UID: \"eda3a2b0-8a17-40b1-b463-7b98159360db\") " pod="openshift-marketplace/redhat-operators-f9d22"
Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.032519 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eda3a2b0-8a17-40b1-b463-7b98159360db-utilities\") pod \"redhat-operators-f9d22\" (UID: \"eda3a2b0-8a17-40b1-b463-7b98159360db\") " pod="openshift-marketplace/redhat-operators-f9d22"
Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.032706 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vr5kp"]
Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.033209 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eda3a2b0-8a17-40b1-b463-7b98159360db-catalog-content\") pod \"redhat-operators-f9d22\" (UID: \"eda3a2b0-8a17-40b1-b463-7b98159360db\") " pod="openshift-marketplace/redhat-operators-f9d22"
Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.056287 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqzlc\" (UniqueName: \"kubernetes.io/projected/eda3a2b0-8a17-40b1-b463-7b98159360db-kube-api-access-wqzlc\") pod \"redhat-operators-f9d22\" (UID: \"eda3a2b0-8a17-40b1-b463-7b98159360db\") " pod="openshift-marketplace/redhat-operators-f9d22"
Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.115322 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8lxg2"]
Oct 07 17:05:50 crc kubenswrapper[4681]: W1007 17:05:50.144136 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeab9b71a_d59d_436b_8c0b_c62801ea9326.slice/crio-e326d2f00b7e4f8e427b18edc5940ba09611546146b5aa85fd3fa7cd55d54b40 WatchSource:0}: Error finding container e326d2f00b7e4f8e427b18edc5940ba09611546146b5aa85fd3fa7cd55d54b40: Status 404 returned error can't find the container with id e326d2f00b7e4f8e427b18edc5940ba09611546146b5aa85fd3fa7cd55d54b40
Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.187948 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f9d22"
Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.239120 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.240160 4681 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.242612 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.243795 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.251481 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.334420 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41740c78-48da-45f3-9b5a-dd196f55ad8f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"41740c78-48da-45f3-9b5a-dd196f55ad8f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.334858 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41740c78-48da-45f3-9b5a-dd196f55ad8f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"41740c78-48da-45f3-9b5a-dd196f55ad8f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.436007 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41740c78-48da-45f3-9b5a-dd196f55ad8f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"41740c78-48da-45f3-9b5a-dd196f55ad8f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.436083 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41740c78-48da-45f3-9b5a-dd196f55ad8f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"41740c78-48da-45f3-9b5a-dd196f55ad8f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.436144 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41740c78-48da-45f3-9b5a-dd196f55ad8f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"41740c78-48da-45f3-9b5a-dd196f55ad8f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.456625 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41740c78-48da-45f3-9b5a-dd196f55ad8f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"41740c78-48da-45f3-9b5a-dd196f55ad8f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.537258 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f9d22"] Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.609674 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.765493 4681 generic.go:334] "Generic (PLEG): container finished" podID="fb8f4ff7-5170-425e-855c-0684f4bdf34b" containerID="3166eb7498ec67156386892e9eb31db962919b4313ef2c32a8f5118e1843e5e5" exitCode=0 Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.765780 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnv6r" event={"ID":"fb8f4ff7-5170-425e-855c-0684f4bdf34b","Type":"ContainerDied","Data":"3166eb7498ec67156386892e9eb31db962919b4313ef2c32a8f5118e1843e5e5"} Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.765821 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnv6r" event={"ID":"fb8f4ff7-5170-425e-855c-0684f4bdf34b","Type":"ContainerStarted","Data":"1a02f645c00d1a01d061655fb63f45a3c8941044064ca9950968a58377feac90"} Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.779965 4681 generic.go:334] "Generic (PLEG): container finished" podID="dbfe57d1-0360-4f50-b36c-cc80a36f868e" containerID="5253f09a46dfdeea12788e8d60c21085bf23c94bed75eaf1431119e185ac658a" exitCode=0 Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.780080 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64mxk" event={"ID":"dbfe57d1-0360-4f50-b36c-cc80a36f868e","Type":"ContainerDied","Data":"5253f09a46dfdeea12788e8d60c21085bf23c94bed75eaf1431119e185ac658a"} Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.787667 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" event={"ID":"66e9eba2-1514-42a7-b14b-802c380cc3b3","Type":"ContainerStarted","Data":"f5625d0927459b5d1ad431f5237259c9f986172ac2163bd135bb35baeafacf69"} Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.787705 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" event={"ID":"66e9eba2-1514-42a7-b14b-802c380cc3b3","Type":"ContainerStarted","Data":"09309c0e5ea1d03a85c2e2ad6a645e3c4c0523c02dc0e1fdd31ea56f70c8eaeb"} Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.788304 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.794062 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f9d22" event={"ID":"eda3a2b0-8a17-40b1-b463-7b98159360db","Type":"ContainerStarted","Data":"05350b882f1004cbff3e3215251ec9af8eae2b7e5ed57571c74f22947a733dd1"} Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.800157 4681 generic.go:334] "Generic (PLEG): container finished" podID="eab9b71a-d59d-436b-8c0b-c62801ea9326" containerID="46552109765a34afe511792ef8afaa3d18bb3194e5d9ac6ce323f1405b0e46c7" exitCode=0 Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.800927 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lxg2" event={"ID":"eab9b71a-d59d-436b-8c0b-c62801ea9326","Type":"ContainerDied","Data":"46552109765a34afe511792ef8afaa3d18bb3194e5d9ac6ce323f1405b0e46c7"} Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.800967 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lxg2" 
event={"ID":"eab9b71a-d59d-436b-8c0b-c62801ea9326","Type":"ContainerStarted","Data":"e326d2f00b7e4f8e427b18edc5940ba09611546146b5aa85fd3fa7cd55d54b40"} Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.867517 4681 patch_prober.go:28] interesting pod/router-default-5444994796-hcltk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 17:05:50 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld Oct 07 17:05:50 crc kubenswrapper[4681]: [+]process-running ok Oct 07 17:05:50 crc kubenswrapper[4681]: healthz check failed Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.867557 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hcltk" podUID="1ae45842-477f-4cc6-9ff7-6c38b866e8f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 17:05:50 crc kubenswrapper[4681]: I1007 17:05:50.929317 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" podStartSLOduration=133.929285384 podStartE2EDuration="2m13.929285384s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:50.893010351 +0000 UTC m=+154.540421916" watchObservedRunningTime="2025-10-07 17:05:50.929285384 +0000 UTC m=+154.576696939" Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.071358 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.081488 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.082131 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.089818 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.097201 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.158751 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.158828 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.167264 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.260490 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.260560 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.261290 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.289706 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.412078 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330940-w7fp5" Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.419372 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.462868 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsljs\" (UniqueName: \"kubernetes.io/projected/abc28b46-2a9f-4141-8e65-a9c956e0f261-kube-api-access-lsljs\") pod \"abc28b46-2a9f-4141-8e65-a9c956e0f261\" (UID: \"abc28b46-2a9f-4141-8e65-a9c956e0f261\") " Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.462974 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/abc28b46-2a9f-4141-8e65-a9c956e0f261-config-volume\") pod \"abc28b46-2a9f-4141-8e65-a9c956e0f261\" (UID: \"abc28b46-2a9f-4141-8e65-a9c956e0f261\") " Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.463074 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/abc28b46-2a9f-4141-8e65-a9c956e0f261-secret-volume\") pod \"abc28b46-2a9f-4141-8e65-a9c956e0f261\" (UID: \"abc28b46-2a9f-4141-8e65-a9c956e0f261\") " Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.465000 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abc28b46-2a9f-4141-8e65-a9c956e0f261-config-volume" (OuterVolumeSpecName: "config-volume") pod "abc28b46-2a9f-4141-8e65-a9c956e0f261" (UID: "abc28b46-2a9f-4141-8e65-a9c956e0f261"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.467317 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abc28b46-2a9f-4141-8e65-a9c956e0f261-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "abc28b46-2a9f-4141-8e65-a9c956e0f261" (UID: "abc28b46-2a9f-4141-8e65-a9c956e0f261"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.468653 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abc28b46-2a9f-4141-8e65-a9c956e0f261-kube-api-access-lsljs" (OuterVolumeSpecName: "kube-api-access-lsljs") pod "abc28b46-2a9f-4141-8e65-a9c956e0f261" (UID: "abc28b46-2a9f-4141-8e65-a9c956e0f261"). InnerVolumeSpecName "kube-api-access-lsljs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.564765 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsljs\" (UniqueName: \"kubernetes.io/projected/abc28b46-2a9f-4141-8e65-a9c956e0f261-kube-api-access-lsljs\") on node \"crc\" DevicePath \"\"" Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.564796 4681 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/abc28b46-2a9f-4141-8e65-a9c956e0f261-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.564805 4681 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/abc28b46-2a9f-4141-8e65-a9c956e0f261-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.589009 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.821568 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330940-w7fp5" Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.821621 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330940-w7fp5" event={"ID":"abc28b46-2a9f-4141-8e65-a9c956e0f261","Type":"ContainerDied","Data":"c538216c8262c04f9d2196c3ac5e54448ba8a1cfa3c4b5e969ede490fad1d20d"} Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.821656 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c538216c8262c04f9d2196c3ac5e54448ba8a1cfa3c4b5e969ede490fad1d20d" Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.826582 4681 generic.go:334] "Generic (PLEG): container finished" podID="eda3a2b0-8a17-40b1-b463-7b98159360db" containerID="11f1122cfd69ef2836aa6b430ca28182f2e3c008fc98facfd19b82cc5d6c87e1" exitCode=0 Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.826678 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f9d22" event={"ID":"eda3a2b0-8a17-40b1-b463-7b98159360db","Type":"ContainerDied","Data":"11f1122cfd69ef2836aa6b430ca28182f2e3c008fc98facfd19b82cc5d6c87e1"} Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.840388 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"41740c78-48da-45f3-9b5a-dd196f55ad8f","Type":"ContainerStarted","Data":"26822697d211d0a1775ac826180ce8739b6122dec92da442780749cb610b4a70"} Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.857191 4681 patch_prober.go:28] interesting pod/router-default-5444994796-hcltk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 17:05:51 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld Oct 07 17:05:51 crc kubenswrapper[4681]: [+]process-running ok Oct 07 17:05:51 crc kubenswrapper[4681]: healthz check failed Oct 07 17:05:51 crc kubenswrapper[4681]: I1007 17:05:51.857264 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hcltk" podUID="1ae45842-477f-4cc6-9ff7-6c38b866e8f9" containerName="router" probeResult="failure" output="HTTP probe failed with 
statuscode: 500" Oct 07 17:05:52 crc kubenswrapper[4681]: I1007 17:05:52.211308 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 07 17:05:52 crc kubenswrapper[4681]: I1007 17:05:52.870198 4681 patch_prober.go:28] interesting pod/router-default-5444994796-hcltk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 17:05:52 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld Oct 07 17:05:52 crc kubenswrapper[4681]: [+]process-running ok Oct 07 17:05:52 crc kubenswrapper[4681]: healthz check failed Oct 07 17:05:52 crc kubenswrapper[4681]: I1007 17:05:52.870886 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hcltk" podUID="1ae45842-477f-4cc6-9ff7-6c38b866e8f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 17:05:52 crc kubenswrapper[4681]: I1007 17:05:52.878183 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3","Type":"ContainerStarted","Data":"1d5a9bb88ad25220432506e243a4665abf65c15f5b5307193e90a3391a7c244e"} Oct 07 17:05:52 crc kubenswrapper[4681]: I1007 17:05:52.904953 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"41740c78-48da-45f3-9b5a-dd196f55ad8f","Type":"ContainerStarted","Data":"8148d6347d0d446974b6b126404332bf895cc712d0b7eea4c79e477c3c6d8e07"} Oct 07 17:05:53 crc kubenswrapper[4681]: I1007 17:05:53.247951 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" Oct 07 17:05:53 crc kubenswrapper[4681]: I1007 17:05:53.254120 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-rgk2c" Oct 07 17:05:53 crc kubenswrapper[4681]: I1007 17:05:53.282912 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.2828946119999998 podStartE2EDuration="3.282894612s" podCreationTimestamp="2025-10-07 17:05:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:05:52.92829455 +0000 UTC m=+156.575706105" watchObservedRunningTime="2025-10-07 17:05:53.282894612 +0000 UTC m=+156.930306167" Oct 07 17:05:53 crc kubenswrapper[4681]: I1007 17:05:53.855577 4681 patch_prober.go:28] interesting pod/router-default-5444994796-hcltk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 17:05:53 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld Oct 07 17:05:53 crc kubenswrapper[4681]: [+]process-running ok Oct 07 17:05:53 crc kubenswrapper[4681]: healthz check failed Oct 07 17:05:53 crc kubenswrapper[4681]: I1007 17:05:53.855890 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hcltk" podUID="1ae45842-477f-4cc6-9ff7-6c38b866e8f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 17:05:53 crc kubenswrapper[4681]: I1007 17:05:53.926764 4681 
generic.go:334] "Generic (PLEG): container finished" podID="41740c78-48da-45f3-9b5a-dd196f55ad8f" containerID="8148d6347d0d446974b6b126404332bf895cc712d0b7eea4c79e477c3c6d8e07" exitCode=0 Oct 07 17:05:53 crc kubenswrapper[4681]: I1007 17:05:53.926828 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"41740c78-48da-45f3-9b5a-dd196f55ad8f","Type":"ContainerDied","Data":"8148d6347d0d446974b6b126404332bf895cc712d0b7eea4c79e477c3c6d8e07"} Oct 07 17:05:53 crc kubenswrapper[4681]: I1007 17:05:53.930272 4681 generic.go:334] "Generic (PLEG): container finished" podID="000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3" containerID="544a504abb25f702ea1391fb7b5a28aa76e7b760037e08a25e6d1c24175e8ece" exitCode=0 Oct 07 17:05:53 crc kubenswrapper[4681]: I1007 17:05:53.930993 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3","Type":"ContainerDied","Data":"544a504abb25f702ea1391fb7b5a28aa76e7b760037e08a25e6d1c24175e8ece"} Oct 07 17:05:53 crc kubenswrapper[4681]: I1007 17:05:53.942012 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-dvzgt" Oct 07 17:05:54 crc kubenswrapper[4681]: I1007 17:05:54.855406 4681 patch_prober.go:28] interesting pod/router-default-5444994796-hcltk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 17:05:54 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld Oct 07 17:05:54 crc kubenswrapper[4681]: [+]process-running ok Oct 07 17:05:54 crc kubenswrapper[4681]: healthz check failed Oct 07 17:05:54 crc kubenswrapper[4681]: I1007 17:05:54.855491 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hcltk" podUID="1ae45842-477f-4cc6-9ff7-6c38b866e8f9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 17:05:55 crc kubenswrapper[4681]: I1007 17:05:55.358979 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 17:05:55 crc kubenswrapper[4681]: I1007 17:05:55.426343 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3-kube-api-access\") pod \"000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3\" (UID: \"000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3\") " Oct 07 17:05:55 crc kubenswrapper[4681]: I1007 17:05:55.427090 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3-kubelet-dir\") pod \"000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3\" (UID: \"000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3\") " Oct 07 17:05:55 crc kubenswrapper[4681]: I1007 17:05:55.427213 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3" (UID: "000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3"). InnerVolumeSpecName "kubelet-dir". 
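
The pod_startup_latency_tracker entries above carry their arithmetic with them: since no image pull time was recorded (firstStartedPulling is the zero time), podStartSLOduration equals watchObservedRunningTime minus podCreationTimestamp, giving 133.929285384s for the image registry and 3.282894612s for revision-pruner-9-crc, with the long decimal tail on the latter being ordinary float64 formatting. The m=+154.54… suffix is the kubelet's monotonic clock, i.e. seconds since the process started. A quick Go check of the registry figure, parsing the timestamps exactly as they appear in the log:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied from the image-registry startup-latency line above.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, err := time.Parse(layout, "2025-10-07 17:03:37 +0000 UTC")
	if err != nil {
		panic(err)
	}
	observed, err := time.Parse(layout, "2025-10-07 17:05:50.929285384 +0000 UTC")
	if err != nil {
		panic(err)
	}
	// Prints 2m13.929285384s, i.e. podStartSLOduration=133.929285384.
	fmt.Println(observed.Sub(created))
}
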
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 17:05:55 crc kubenswrapper[4681]: I1007 17:05:55.427432 4681 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 07 17:05:55 crc kubenswrapper[4681]: I1007 17:05:55.446004 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3" (UID: "000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:05:55 crc kubenswrapper[4681]: I1007 17:05:55.497674 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 17:05:55 crc kubenswrapper[4681]: I1007 17:05:55.528068 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41740c78-48da-45f3-9b5a-dd196f55ad8f-kube-api-access\") pod \"41740c78-48da-45f3-9b5a-dd196f55ad8f\" (UID: \"41740c78-48da-45f3-9b5a-dd196f55ad8f\") " Oct 07 17:05:55 crc kubenswrapper[4681]: I1007 17:05:55.528190 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41740c78-48da-45f3-9b5a-dd196f55ad8f-kubelet-dir\") pod \"41740c78-48da-45f3-9b5a-dd196f55ad8f\" (UID: \"41740c78-48da-45f3-9b5a-dd196f55ad8f\") " Oct 07 17:05:55 crc kubenswrapper[4681]: I1007 17:05:55.528475 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 17:05:55 crc kubenswrapper[4681]: I1007 17:05:55.528524 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41740c78-48da-45f3-9b5a-dd196f55ad8f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "41740c78-48da-45f3-9b5a-dd196f55ad8f" (UID: "41740c78-48da-45f3-9b5a-dd196f55ad8f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 17:05:55 crc kubenswrapper[4681]: I1007 17:05:55.534308 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41740c78-48da-45f3-9b5a-dd196f55ad8f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "41740c78-48da-45f3-9b5a-dd196f55ad8f" (UID: "41740c78-48da-45f3-9b5a-dd196f55ad8f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:05:55 crc kubenswrapper[4681]: I1007 17:05:55.630245 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41740c78-48da-45f3-9b5a-dd196f55ad8f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 17:05:55 crc kubenswrapper[4681]: I1007 17:05:55.630532 4681 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41740c78-48da-45f3-9b5a-dd196f55ad8f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 07 17:05:55 crc kubenswrapper[4681]: I1007 17:05:55.856091 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-hcltk" Oct 07 17:05:55 crc kubenswrapper[4681]: I1007 17:05:55.862434 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-hcltk" Oct 07 17:05:56 crc kubenswrapper[4681]: I1007 17:05:56.024162 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 17:05:56 crc kubenswrapper[4681]: I1007 17:05:56.025580 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3","Type":"ContainerDied","Data":"1d5a9bb88ad25220432506e243a4665abf65c15f5b5307193e90a3391a7c244e"} Oct 07 17:05:56 crc kubenswrapper[4681]: I1007 17:05:56.025727 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d5a9bb88ad25220432506e243a4665abf65c15f5b5307193e90a3391a7c244e" Oct 07 17:05:56 crc kubenswrapper[4681]: I1007 17:05:56.030178 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 17:05:56 crc kubenswrapper[4681]: I1007 17:05:56.030304 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"41740c78-48da-45f3-9b5a-dd196f55ad8f","Type":"ContainerDied","Data":"26822697d211d0a1775ac826180ce8739b6122dec92da442780749cb610b4a70"} Oct 07 17:05:56 crc kubenswrapper[4681]: I1007 17:05:56.030351 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26822697d211d0a1775ac826180ce8739b6122dec92da442780749cb610b4a70" Oct 07 17:05:58 crc kubenswrapper[4681]: I1007 17:05:58.233339 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-nds8d" Oct 07 17:05:58 crc kubenswrapper[4681]: I1007 17:05:58.237967 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-nds8d" Oct 07 17:05:58 crc kubenswrapper[4681]: I1007 17:05:58.456714 4681 patch_prober.go:28] interesting pod/downloads-7954f5f757-j8hqs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Oct 07 17:05:58 crc kubenswrapper[4681]: I1007 17:05:58.456716 4681 patch_prober.go:28] interesting pod/downloads-7954f5f757-j8hqs container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Oct 07 17:05:58 crc kubenswrapper[4681]: I1007 17:05:58.456764 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-j8hqs" podUID="c669d56a-7d2e-4161-ac70-29d72a747038" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Oct 07 17:05:58 crc kubenswrapper[4681]: I1007 17:05:58.456788 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-j8hqs" podUID="c669d56a-7d2e-4161-ac70-29d72a747038" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Oct 07 17:05:59 crc kubenswrapper[4681]: I1007 17:05:59.398931 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/35b1b84e-518a-4567-8ad9-0e717e9958fb-metrics-certs\") pod \"network-metrics-daemon-xjf9z\" (UID: \"35b1b84e-518a-4567-8ad9-0e717e9958fb\") " pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:05:59 crc kubenswrapper[4681]: I1007 17:05:59.417701 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/35b1b84e-518a-4567-8ad9-0e717e9958fb-metrics-certs\") pod \"network-metrics-daemon-xjf9z\" (UID: \"35b1b84e-518a-4567-8ad9-0e717e9958fb\") " pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:05:59 crc kubenswrapper[4681]: I1007 17:05:59.540620 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xjf9z" Oct 07 17:06:02 crc kubenswrapper[4681]: I1007 17:06:02.159370 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-dbxkf_3017a611-cb0d-4f79-b6f8-2634dc026e2e/cluster-samples-operator/0.log" Oct 07 17:06:02 crc kubenswrapper[4681]: I1007 17:06:02.159794 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dbxkf" event={"ID":"3017a611-cb0d-4f79-b6f8-2634dc026e2e","Type":"ContainerDied","Data":"f94585d3de9fff5925815482b16bba316ef1daf4274167f4929a9a783ba331d7"} Oct 07 17:06:02 crc kubenswrapper[4681]: I1007 17:06:02.159805 4681 generic.go:334] "Generic (PLEG): container finished" podID="3017a611-cb0d-4f79-b6f8-2634dc026e2e" containerID="f94585d3de9fff5925815482b16bba316ef1daf4274167f4929a9a783ba331d7" exitCode=2 Oct 07 17:06:02 crc kubenswrapper[4681]: I1007 17:06:02.160283 4681 scope.go:117] "RemoveContainer" containerID="f94585d3de9fff5925815482b16bba316ef1daf4274167f4929a9a783ba331d7" Oct 07 17:06:08 crc kubenswrapper[4681]: I1007 17:06:08.484969 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-j8hqs" Oct 07 17:06:09 crc kubenswrapper[4681]: I1007 17:06:09.616633 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" Oct 07 17:06:12 crc kubenswrapper[4681]: I1007 17:06:12.195208 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 17:06:12 crc kubenswrapper[4681]: I1007 17:06:12.195550 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 17:06:19 crc kubenswrapper[4681]: I1007 17:06:19.586955 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7vd82" Oct 07 17:06:24 crc kubenswrapper[4681]: E1007 17:06:24.566513 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 07 17:06:24 crc kubenswrapper[4681]: E1007 17:06:24.567368 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5wqll,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-mj5z5_openshift-marketplace(2ecebb16-848c-4597-8f86-7779f4c82530): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 17:06:24 crc kubenswrapper[4681]: E1007 17:06:24.568641 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-mj5z5" podUID="2ecebb16-848c-4597-8f86-7779f4c82530" Oct 07 17:06:25 crc kubenswrapper[4681]: I1007 17:06:25.071380 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 17:06:25 crc kubenswrapper[4681]: E1007 17:06:25.868664 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-mj5z5" podUID="2ecebb16-848c-4597-8f86-7779f4c82530" Oct 07 17:06:25 crc kubenswrapper[4681]: E1007 17:06:25.979163 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 07 17:06:25 crc kubenswrapper[4681]: E1007 17:06:25.979326 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hlw5n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-4mrgc_openshift-marketplace(29ae017e-1bbd-4cf3-bda3-5fd9a25866c7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 17:06:25 crc kubenswrapper[4681]: E1007 17:06:25.980696 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-4mrgc" podUID="29ae017e-1bbd-4cf3-bda3-5fd9a25866c7" Oct 07 17:06:28 crc kubenswrapper[4681]: E1007 17:06:28.758566 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-4mrgc" podUID="29ae017e-1bbd-4cf3-bda3-5fd9a25866c7" Oct 07 17:06:28 crc kubenswrapper[4681]: E1007 17:06:28.816967 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 07 17:06:28 crc kubenswrapper[4681]: E1007 17:06:28.817346 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xs7d4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-cvgwt_openshift-marketplace(c8118bad-bdb8-4a82-aef8-a70d685fe13a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 17:06:28 crc kubenswrapper[4681]: E1007 17:06:28.821014 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-cvgwt" podUID="c8118bad-bdb8-4a82-aef8-a70d685fe13a" Oct 07 17:06:34 crc kubenswrapper[4681]: E1007 17:06:34.984735 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-cvgwt" podUID="c8118bad-bdb8-4a82-aef8-a70d685fe13a" Oct 07 17:06:36 crc kubenswrapper[4681]: E1007 17:06:36.050426 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 07 17:06:36 crc kubenswrapper[4681]: E1007 17:06:36.050599 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wqzlc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-f9d22_openshift-marketplace(eda3a2b0-8a17-40b1-b463-7b98159360db): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 17:06:36 crc kubenswrapper[4681]: E1007 17:06:36.051969 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-f9d22" podUID="eda3a2b0-8a17-40b1-b463-7b98159360db" Oct 07 17:06:36 crc kubenswrapper[4681]: E1007 17:06:36.772942 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 07 17:06:36 crc kubenswrapper[4681]: E1007 17:06:36.773112 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7psn5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-svcfw_openshift-marketplace(390445e9-214f-423d-b39d-9411ca5cf099): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 17:06:36 crc kubenswrapper[4681]: E1007 17:06:36.774274 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-svcfw" podUID="390445e9-214f-423d-b39d-9411ca5cf099" Oct 07 17:06:37 crc kubenswrapper[4681]: E1007 17:06:37.052178 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-f9d22" podUID="eda3a2b0-8a17-40b1-b463-7b98159360db" Oct 07 17:06:37 crc kubenswrapper[4681]: E1007 17:06:37.370228 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-svcfw" podUID="390445e9-214f-423d-b39d-9411ca5cf099" Oct 07 17:06:37 crc kubenswrapper[4681]: I1007 17:06:37.447765 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xjf9z"] Oct 07 17:06:38 crc kubenswrapper[4681]: E1007 17:06:38.195631 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 07 17:06:38 crc kubenswrapper[4681]: E1007 17:06:38.196130 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog 
--cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cm2q4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-8lxg2_openshift-marketplace(eab9b71a-d59d-436b-8c0b-c62801ea9326): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 17:06:38 crc kubenswrapper[4681]: E1007 17:06:38.197489 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-8lxg2" podUID="eab9b71a-d59d-436b-8c0b-c62801ea9326" Oct 07 17:06:38 crc kubenswrapper[4681]: E1007 17:06:38.269460 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 07 17:06:38 crc kubenswrapper[4681]: E1007 17:06:38.269633 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rmd5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-64mxk_openshift-marketplace(dbfe57d1-0360-4f50-b36c-cc80a36f868e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 17:06:38 crc kubenswrapper[4681]: E1007 17:06:38.271054 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-64mxk" podUID="dbfe57d1-0360-4f50-b36c-cc80a36f868e" Oct 07 17:06:38 crc kubenswrapper[4681]: E1007 17:06:38.359435 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 07 17:06:38 crc kubenswrapper[4681]: E1007 17:06:38.359578 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jvvcz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-jnv6r_openshift-marketplace(fb8f4ff7-5170-425e-855c-0684f4bdf34b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 17:06:38 crc kubenswrapper[4681]: E1007 17:06:38.361209 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-jnv6r" podUID="fb8f4ff7-5170-425e-855c-0684f4bdf34b" Oct 07 17:06:38 crc kubenswrapper[4681]: I1007 17:06:38.375647 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xjf9z" event={"ID":"35b1b84e-518a-4567-8ad9-0e717e9958fb","Type":"ContainerStarted","Data":"0dbc78c685f33f180340794a8ea44f2c332dc6f3450916fc8229472a6a471ed1"} Oct 07 17:06:38 crc kubenswrapper[4681]: I1007 17:06:38.375694 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xjf9z" event={"ID":"35b1b84e-518a-4567-8ad9-0e717e9958fb","Type":"ContainerStarted","Data":"03e3b4576025e9b6fadb5bd88a110176897455b3092bf809e46a8455e9a7fb7d"} Oct 07 17:06:38 crc kubenswrapper[4681]: I1007 17:06:38.375704 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xjf9z" event={"ID":"35b1b84e-518a-4567-8ad9-0e717e9958fb","Type":"ContainerStarted","Data":"e34ca1052ecf9f9ba121b58d021896be6547e55cd4575b66a2dd32bfd225328f"} Oct 07 17:06:38 crc kubenswrapper[4681]: I1007 17:06:38.389049 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-dbxkf_3017a611-cb0d-4f79-b6f8-2634dc026e2e/cluster-samples-operator/0.log" Oct 07 17:06:38 crc kubenswrapper[4681]: I1007 17:06:38.390544 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dbxkf" 
event={"ID":"3017a611-cb0d-4f79-b6f8-2634dc026e2e","Type":"ContainerStarted","Data":"4f741492a038a0d627dbbcf7e12bef3bf0b42dd55e1d540c8c13737f4faaf65e"} Oct 07 17:06:38 crc kubenswrapper[4681]: I1007 17:06:38.398501 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-xjf9z" podStartSLOduration=181.398482256 podStartE2EDuration="3m1.398482256s" podCreationTimestamp="2025-10-07 17:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:06:38.395942182 +0000 UTC m=+202.043353757" watchObservedRunningTime="2025-10-07 17:06:38.398482256 +0000 UTC m=+202.045893811" Oct 07 17:06:38 crc kubenswrapper[4681]: E1007 17:06:38.405286 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-8lxg2" podUID="eab9b71a-d59d-436b-8c0b-c62801ea9326" Oct 07 17:06:38 crc kubenswrapper[4681]: E1007 17:06:38.405309 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-64mxk" podUID="dbfe57d1-0360-4f50-b36c-cc80a36f868e" Oct 07 17:06:38 crc kubenswrapper[4681]: E1007 17:06:38.409524 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-jnv6r" podUID="fb8f4ff7-5170-425e-855c-0684f4bdf34b" Oct 07 17:06:39 crc kubenswrapper[4681]: I1007 17:06:39.395449 4681 generic.go:334] "Generic (PLEG): container finished" podID="2ecebb16-848c-4597-8f86-7779f4c82530" containerID="3fc5014badd5a555359b66c157566fbaec524df0780f9396e4fda120868a1e8f" exitCode=0 Oct 07 17:06:39 crc kubenswrapper[4681]: I1007 17:06:39.395598 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mj5z5" event={"ID":"2ecebb16-848c-4597-8f86-7779f4c82530","Type":"ContainerDied","Data":"3fc5014badd5a555359b66c157566fbaec524df0780f9396e4fda120868a1e8f"} Oct 07 17:06:40 crc kubenswrapper[4681]: I1007 17:06:40.403314 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mj5z5" event={"ID":"2ecebb16-848c-4597-8f86-7779f4c82530","Type":"ContainerStarted","Data":"e5761e0689e04476b155e7b08933105fe1e016122e76d44c85ab3d9d1ff9b8cc"} Oct 07 17:06:40 crc kubenswrapper[4681]: I1007 17:06:40.418726 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mj5z5" podStartSLOduration=4.073322062 podStartE2EDuration="54.418710138s" podCreationTimestamp="2025-10-07 17:05:46 +0000 UTC" firstStartedPulling="2025-10-07 17:05:49.669639346 +0000 UTC m=+153.317050901" lastFinishedPulling="2025-10-07 17:06:40.015027422 +0000 UTC m=+203.662438977" observedRunningTime="2025-10-07 17:06:40.416988558 +0000 UTC m=+204.064400113" watchObservedRunningTime="2025-10-07 17:06:40.418710138 +0000 UTC m=+204.066121693" Oct 07 17:06:42 crc kubenswrapper[4681]: I1007 17:06:42.194952 4681 patch_prober.go:28] interesting 
pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 17:06:42 crc kubenswrapper[4681]: I1007 17:06:42.195193 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 17:06:42 crc kubenswrapper[4681]: I1007 17:06:42.195256 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" Oct 07 17:06:42 crc kubenswrapper[4681]: I1007 17:06:42.195917 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c"} pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 17:06:42 crc kubenswrapper[4681]: I1007 17:06:42.196025 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" containerID="cri-o://239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c" gracePeriod=600 Oct 07 17:06:42 crc kubenswrapper[4681]: I1007 17:06:42.416008 4681 generic.go:334] "Generic (PLEG): container finished" podID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerID="239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c" exitCode=0 Oct 07 17:06:42 crc kubenswrapper[4681]: I1007 17:06:42.416105 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerDied","Data":"239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c"} Oct 07 17:06:43 crc kubenswrapper[4681]: I1007 17:06:43.423832 4681 generic.go:334] "Generic (PLEG): container finished" podID="29ae017e-1bbd-4cf3-bda3-5fd9a25866c7" containerID="fb3247a1c61885af0659f1602ab3e021ec519f1774fee86e64c4abbf7d1cd3a5" exitCode=0 Oct 07 17:06:43 crc kubenswrapper[4681]: I1007 17:06:43.424374 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mrgc" event={"ID":"29ae017e-1bbd-4cf3-bda3-5fd9a25866c7","Type":"ContainerDied","Data":"fb3247a1c61885af0659f1602ab3e021ec519f1774fee86e64c4abbf7d1cd3a5"} Oct 07 17:06:43 crc kubenswrapper[4681]: I1007 17:06:43.433657 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerStarted","Data":"39663f4fcfd152dc8dc829b17c20ffbb5fc718910f72b7fe94058ef2d7e4c422"} Oct 07 17:06:44 crc kubenswrapper[4681]: I1007 17:06:44.440150 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mrgc" event={"ID":"29ae017e-1bbd-4cf3-bda3-5fd9a25866c7","Type":"ContainerStarted","Data":"9708daad2e691c08a7db9e717464f432b35883e6d923124bacaf62aa7c8a5d73"} Oct 07 17:06:44 crc kubenswrapper[4681]: 
I1007 17:06:44.461377 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4mrgc" podStartSLOduration=3.014891704 podStartE2EDuration="58.461351726s" podCreationTimestamp="2025-10-07 17:05:46 +0000 UTC" firstStartedPulling="2025-10-07 17:05:48.567322024 +0000 UTC m=+152.214733579" lastFinishedPulling="2025-10-07 17:06:44.013782046 +0000 UTC m=+207.661193601" observedRunningTime="2025-10-07 17:06:44.458609407 +0000 UTC m=+208.106020962" watchObservedRunningTime="2025-10-07 17:06:44.461351726 +0000 UTC m=+208.108763281" Oct 07 17:06:46 crc kubenswrapper[4681]: I1007 17:06:46.648890 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4mrgc" Oct 07 17:06:46 crc kubenswrapper[4681]: I1007 17:06:46.649427 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4mrgc" Oct 07 17:06:46 crc kubenswrapper[4681]: I1007 17:06:46.822714 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4mrgc" Oct 07 17:06:47 crc kubenswrapper[4681]: I1007 17:06:47.248276 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mj5z5" Oct 07 17:06:47 crc kubenswrapper[4681]: I1007 17:06:47.248598 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mj5z5" Oct 07 17:06:47 crc kubenswrapper[4681]: I1007 17:06:47.287838 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mj5z5" Oct 07 17:06:47 crc kubenswrapper[4681]: I1007 17:06:47.492048 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mj5z5" Oct 07 17:06:48 crc kubenswrapper[4681]: I1007 17:06:48.596036 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mj5z5"] Oct 07 17:06:49 crc kubenswrapper[4681]: I1007 17:06:49.461830 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mj5z5" podUID="2ecebb16-848c-4597-8f86-7779f4c82530" containerName="registry-server" containerID="cri-o://e5761e0689e04476b155e7b08933105fe1e016122e76d44c85ab3d9d1ff9b8cc" gracePeriod=2 Oct 07 17:06:49 crc kubenswrapper[4681]: I1007 17:06:49.847707 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mj5z5" Oct 07 17:06:50 crc kubenswrapper[4681]: I1007 17:06:50.036913 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wqll\" (UniqueName: \"kubernetes.io/projected/2ecebb16-848c-4597-8f86-7779f4c82530-kube-api-access-5wqll\") pod \"2ecebb16-848c-4597-8f86-7779f4c82530\" (UID: \"2ecebb16-848c-4597-8f86-7779f4c82530\") " Oct 07 17:06:50 crc kubenswrapper[4681]: I1007 17:06:50.036981 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ecebb16-848c-4597-8f86-7779f4c82530-catalog-content\") pod \"2ecebb16-848c-4597-8f86-7779f4c82530\" (UID: \"2ecebb16-848c-4597-8f86-7779f4c82530\") " Oct 07 17:06:50 crc kubenswrapper[4681]: I1007 17:06:50.037066 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ecebb16-848c-4597-8f86-7779f4c82530-utilities\") pod \"2ecebb16-848c-4597-8f86-7779f4c82530\" (UID: \"2ecebb16-848c-4597-8f86-7779f4c82530\") " Oct 07 17:06:50 crc kubenswrapper[4681]: I1007 17:06:50.038021 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ecebb16-848c-4597-8f86-7779f4c82530-utilities" (OuterVolumeSpecName: "utilities") pod "2ecebb16-848c-4597-8f86-7779f4c82530" (UID: "2ecebb16-848c-4597-8f86-7779f4c82530"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:06:50 crc kubenswrapper[4681]: I1007 17:06:50.052124 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ecebb16-848c-4597-8f86-7779f4c82530-kube-api-access-5wqll" (OuterVolumeSpecName: "kube-api-access-5wqll") pod "2ecebb16-848c-4597-8f86-7779f4c82530" (UID: "2ecebb16-848c-4597-8f86-7779f4c82530"). InnerVolumeSpecName "kube-api-access-5wqll". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:06:50 crc kubenswrapper[4681]: I1007 17:06:50.107216 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ecebb16-848c-4597-8f86-7779f4c82530-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ecebb16-848c-4597-8f86-7779f4c82530" (UID: "2ecebb16-848c-4597-8f86-7779f4c82530"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:06:50 crc kubenswrapper[4681]: I1007 17:06:50.138072 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wqll\" (UniqueName: \"kubernetes.io/projected/2ecebb16-848c-4597-8f86-7779f4c82530-kube-api-access-5wqll\") on node \"crc\" DevicePath \"\"" Oct 07 17:06:50 crc kubenswrapper[4681]: I1007 17:06:50.138107 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ecebb16-848c-4597-8f86-7779f4c82530-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 17:06:50 crc kubenswrapper[4681]: I1007 17:06:50.138119 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ecebb16-848c-4597-8f86-7779f4c82530-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 17:06:50 crc kubenswrapper[4681]: I1007 17:06:50.468780 4681 generic.go:334] "Generic (PLEG): container finished" podID="2ecebb16-848c-4597-8f86-7779f4c82530" containerID="e5761e0689e04476b155e7b08933105fe1e016122e76d44c85ab3d9d1ff9b8cc" exitCode=0 Oct 07 17:06:50 crc kubenswrapper[4681]: I1007 17:06:50.468836 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mj5z5" event={"ID":"2ecebb16-848c-4597-8f86-7779f4c82530","Type":"ContainerDied","Data":"e5761e0689e04476b155e7b08933105fe1e016122e76d44c85ab3d9d1ff9b8cc"} Oct 07 17:06:50 crc kubenswrapper[4681]: I1007 17:06:50.468894 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mj5z5" event={"ID":"2ecebb16-848c-4597-8f86-7779f4c82530","Type":"ContainerDied","Data":"9e112731647abbde06dfff824c0753a337973eb6f7636f07750f88d6c1e8557e"} Oct 07 17:06:50 crc kubenswrapper[4681]: I1007 17:06:50.468919 4681 scope.go:117] "RemoveContainer" containerID="e5761e0689e04476b155e7b08933105fe1e016122e76d44c85ab3d9d1ff9b8cc" Oct 07 17:06:50 crc kubenswrapper[4681]: I1007 17:06:50.469413 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mj5z5" Oct 07 17:06:50 crc kubenswrapper[4681]: I1007 17:06:50.470622 4681 generic.go:334] "Generic (PLEG): container finished" podID="dbfe57d1-0360-4f50-b36c-cc80a36f868e" containerID="7274aa83f539abcd50a2add9ea5c2ae0e4ed5cd1ac950c0e387b4725e39f0e70" exitCode=0 Oct 07 17:06:50 crc kubenswrapper[4681]: I1007 17:06:50.470655 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64mxk" event={"ID":"dbfe57d1-0360-4f50-b36c-cc80a36f868e","Type":"ContainerDied","Data":"7274aa83f539abcd50a2add9ea5c2ae0e4ed5cd1ac950c0e387b4725e39f0e70"} Oct 07 17:06:50 crc kubenswrapper[4681]: I1007 17:06:50.498614 4681 scope.go:117] "RemoveContainer" containerID="3fc5014badd5a555359b66c157566fbaec524df0780f9396e4fda120868a1e8f" Oct 07 17:06:50 crc kubenswrapper[4681]: I1007 17:06:50.514104 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mj5z5"] Oct 07 17:06:50 crc kubenswrapper[4681]: I1007 17:06:50.533419 4681 scope.go:117] "RemoveContainer" containerID="3a6869215e814e2e92aab9a538b57f358bbd8642c0b84e8fae0395d4a3a633d5" Oct 07 17:06:50 crc kubenswrapper[4681]: I1007 17:06:50.535378 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mj5z5"] Oct 07 17:06:50 crc kubenswrapper[4681]: I1007 17:06:50.548595 4681 scope.go:117] "RemoveContainer" containerID="e5761e0689e04476b155e7b08933105fe1e016122e76d44c85ab3d9d1ff9b8cc" Oct 07 17:06:50 crc kubenswrapper[4681]: E1007 17:06:50.549053 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5761e0689e04476b155e7b08933105fe1e016122e76d44c85ab3d9d1ff9b8cc\": container with ID starting with e5761e0689e04476b155e7b08933105fe1e016122e76d44c85ab3d9d1ff9b8cc not found: ID does not exist" containerID="e5761e0689e04476b155e7b08933105fe1e016122e76d44c85ab3d9d1ff9b8cc" Oct 07 17:06:50 crc kubenswrapper[4681]: I1007 17:06:50.549089 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5761e0689e04476b155e7b08933105fe1e016122e76d44c85ab3d9d1ff9b8cc"} err="failed to get container status \"e5761e0689e04476b155e7b08933105fe1e016122e76d44c85ab3d9d1ff9b8cc\": rpc error: code = NotFound desc = could not find container \"e5761e0689e04476b155e7b08933105fe1e016122e76d44c85ab3d9d1ff9b8cc\": container with ID starting with e5761e0689e04476b155e7b08933105fe1e016122e76d44c85ab3d9d1ff9b8cc not found: ID does not exist" Oct 07 17:06:50 crc kubenswrapper[4681]: I1007 17:06:50.549115 4681 scope.go:117] "RemoveContainer" containerID="3fc5014badd5a555359b66c157566fbaec524df0780f9396e4fda120868a1e8f" Oct 07 17:06:50 crc kubenswrapper[4681]: E1007 17:06:50.549565 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fc5014badd5a555359b66c157566fbaec524df0780f9396e4fda120868a1e8f\": container with ID starting with 3fc5014badd5a555359b66c157566fbaec524df0780f9396e4fda120868a1e8f not found: ID does not exist" containerID="3fc5014badd5a555359b66c157566fbaec524df0780f9396e4fda120868a1e8f" Oct 07 17:06:50 crc kubenswrapper[4681]: I1007 17:06:50.549594 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fc5014badd5a555359b66c157566fbaec524df0780f9396e4fda120868a1e8f"} err="failed to get container status 
\"3fc5014badd5a555359b66c157566fbaec524df0780f9396e4fda120868a1e8f\": rpc error: code = NotFound desc = could not find container \"3fc5014badd5a555359b66c157566fbaec524df0780f9396e4fda120868a1e8f\": container with ID starting with 3fc5014badd5a555359b66c157566fbaec524df0780f9396e4fda120868a1e8f not found: ID does not exist" Oct 07 17:06:50 crc kubenswrapper[4681]: I1007 17:06:50.549619 4681 scope.go:117] "RemoveContainer" containerID="3a6869215e814e2e92aab9a538b57f358bbd8642c0b84e8fae0395d4a3a633d5" Oct 07 17:06:50 crc kubenswrapper[4681]: E1007 17:06:50.549867 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a6869215e814e2e92aab9a538b57f358bbd8642c0b84e8fae0395d4a3a633d5\": container with ID starting with 3a6869215e814e2e92aab9a538b57f358bbd8642c0b84e8fae0395d4a3a633d5 not found: ID does not exist" containerID="3a6869215e814e2e92aab9a538b57f358bbd8642c0b84e8fae0395d4a3a633d5" Oct 07 17:06:50 crc kubenswrapper[4681]: I1007 17:06:50.549906 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a6869215e814e2e92aab9a538b57f358bbd8642c0b84e8fae0395d4a3a633d5"} err="failed to get container status \"3a6869215e814e2e92aab9a538b57f358bbd8642c0b84e8fae0395d4a3a633d5\": rpc error: code = NotFound desc = could not find container \"3a6869215e814e2e92aab9a538b57f358bbd8642c0b84e8fae0395d4a3a633d5\": container with ID starting with 3a6869215e814e2e92aab9a538b57f358bbd8642c0b84e8fae0395d4a3a633d5 not found: ID does not exist" Oct 07 17:06:51 crc kubenswrapper[4681]: I1007 17:06:51.043103 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ecebb16-848c-4597-8f86-7779f4c82530" path="/var/lib/kubelet/pods/2ecebb16-848c-4597-8f86-7779f4c82530/volumes" Oct 07 17:06:51 crc kubenswrapper[4681]: I1007 17:06:51.479011 4681 generic.go:334] "Generic (PLEG): container finished" podID="fb8f4ff7-5170-425e-855c-0684f4bdf34b" containerID="b66a5fe5250983b3653bfe4d371513d34753c302d47bc5a3d08b8073966026af" exitCode=0 Oct 07 17:06:51 crc kubenswrapper[4681]: I1007 17:06:51.479147 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnv6r" event={"ID":"fb8f4ff7-5170-425e-855c-0684f4bdf34b","Type":"ContainerDied","Data":"b66a5fe5250983b3653bfe4d371513d34753c302d47bc5a3d08b8073966026af"} Oct 07 17:06:51 crc kubenswrapper[4681]: I1007 17:06:51.484728 4681 generic.go:334] "Generic (PLEG): container finished" podID="c8118bad-bdb8-4a82-aef8-a70d685fe13a" containerID="15602a6f4888dbcf2b3fc01f42b487112cfa63fab146217643336d79661cb2f7" exitCode=0 Oct 07 17:06:51 crc kubenswrapper[4681]: I1007 17:06:51.484906 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cvgwt" event={"ID":"c8118bad-bdb8-4a82-aef8-a70d685fe13a","Type":"ContainerDied","Data":"15602a6f4888dbcf2b3fc01f42b487112cfa63fab146217643336d79661cb2f7"} Oct 07 17:06:51 crc kubenswrapper[4681]: I1007 17:06:51.497507 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64mxk" event={"ID":"dbfe57d1-0360-4f50-b36c-cc80a36f868e","Type":"ContainerStarted","Data":"348072a04a5b2834fbd8a89e53701d4a6f46857770508b7d6f38a69fa17544e5"} Oct 07 17:06:51 crc kubenswrapper[4681]: I1007 17:06:51.529001 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-64mxk" podStartSLOduration=2.2175998 podStartE2EDuration="1m3.528985828s" 
podCreationTimestamp="2025-10-07 17:05:48 +0000 UTC" firstStartedPulling="2025-10-07 17:05:49.742053518 +0000 UTC m=+153.389465073" lastFinishedPulling="2025-10-07 17:06:51.053439546 +0000 UTC m=+214.700851101" observedRunningTime="2025-10-07 17:06:51.528120383 +0000 UTC m=+215.175531968" watchObservedRunningTime="2025-10-07 17:06:51.528985828 +0000 UTC m=+215.176397383" Oct 07 17:06:52 crc kubenswrapper[4681]: I1007 17:06:52.504699 4681 generic.go:334] "Generic (PLEG): container finished" podID="390445e9-214f-423d-b39d-9411ca5cf099" containerID="c7706e712721246ae55ae5768a4c11db99c8f14804ea649a3c2e04f4d541fa7d" exitCode=0 Oct 07 17:06:52 crc kubenswrapper[4681]: I1007 17:06:52.504789 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svcfw" event={"ID":"390445e9-214f-423d-b39d-9411ca5cf099","Type":"ContainerDied","Data":"c7706e712721246ae55ae5768a4c11db99c8f14804ea649a3c2e04f4d541fa7d"} Oct 07 17:06:52 crc kubenswrapper[4681]: I1007 17:06:52.507168 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnv6r" event={"ID":"fb8f4ff7-5170-425e-855c-0684f4bdf34b","Type":"ContainerStarted","Data":"299057c1bd3802ccae575b82d569e51d605da84d43a8327235f8af4e135cb66a"} Oct 07 17:06:52 crc kubenswrapper[4681]: I1007 17:06:52.512058 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cvgwt" event={"ID":"c8118bad-bdb8-4a82-aef8-a70d685fe13a","Type":"ContainerStarted","Data":"0e7fd62e794b4c54d2c73acd557c0dd1254720409d1a2e084237ff11ef194b27"} Oct 07 17:06:52 crc kubenswrapper[4681]: I1007 17:06:52.516267 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lxg2" event={"ID":"eab9b71a-d59d-436b-8c0b-c62801ea9326","Type":"ContainerStarted","Data":"ca9565b683133920f4bd8af30476a3682b5f9fb9fe337c0733d5c538bf05c6c1"} Oct 07 17:06:52 crc kubenswrapper[4681]: I1007 17:06:52.543942 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jnv6r" podStartSLOduration=3.341000914 podStartE2EDuration="1m4.543923144s" podCreationTimestamp="2025-10-07 17:05:48 +0000 UTC" firstStartedPulling="2025-10-07 17:05:50.76855562 +0000 UTC m=+154.415967175" lastFinishedPulling="2025-10-07 17:06:51.97147785 +0000 UTC m=+215.618889405" observedRunningTime="2025-10-07 17:06:52.54102017 +0000 UTC m=+216.188431745" watchObservedRunningTime="2025-10-07 17:06:52.543923144 +0000 UTC m=+216.191334699" Oct 07 17:06:52 crc kubenswrapper[4681]: I1007 17:06:52.578728 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cvgwt" podStartSLOduration=4.281714811 podStartE2EDuration="1m6.578710104s" podCreationTimestamp="2025-10-07 17:05:46 +0000 UTC" firstStartedPulling="2025-10-07 17:05:49.730247516 +0000 UTC m=+153.377659071" lastFinishedPulling="2025-10-07 17:06:52.027242809 +0000 UTC m=+215.674654364" observedRunningTime="2025-10-07 17:06:52.575348446 +0000 UTC m=+216.222760001" watchObservedRunningTime="2025-10-07 17:06:52.578710104 +0000 UTC m=+216.226121649" Oct 07 17:06:53 crc kubenswrapper[4681]: I1007 17:06:53.522587 4681 generic.go:334] "Generic (PLEG): container finished" podID="eab9b71a-d59d-436b-8c0b-c62801ea9326" containerID="ca9565b683133920f4bd8af30476a3682b5f9fb9fe337c0733d5c538bf05c6c1" exitCode=0 Oct 07 17:06:53 crc kubenswrapper[4681]: I1007 17:06:53.522642 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-8lxg2" event={"ID":"eab9b71a-d59d-436b-8c0b-c62801ea9326","Type":"ContainerDied","Data":"ca9565b683133920f4bd8af30476a3682b5f9fb9fe337c0733d5c538bf05c6c1"} Oct 07 17:06:56 crc kubenswrapper[4681]: I1007 17:06:56.539211 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svcfw" event={"ID":"390445e9-214f-423d-b39d-9411ca5cf099","Type":"ContainerStarted","Data":"82f446b3e202f296984385071a20f4ecd4ad5cae4b4739b020d223d8000196e4"} Oct 07 17:06:56 crc kubenswrapper[4681]: I1007 17:06:56.541391 4681 generic.go:334] "Generic (PLEG): container finished" podID="eda3a2b0-8a17-40b1-b463-7b98159360db" containerID="a02780ccfc64bb506b8f324c2778c2bff70cebe91951b030b8a5b69a112c4a92" exitCode=0 Oct 07 17:06:56 crc kubenswrapper[4681]: I1007 17:06:56.541459 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f9d22" event={"ID":"eda3a2b0-8a17-40b1-b463-7b98159360db","Type":"ContainerDied","Data":"a02780ccfc64bb506b8f324c2778c2bff70cebe91951b030b8a5b69a112c4a92"} Oct 07 17:06:56 crc kubenswrapper[4681]: I1007 17:06:56.544524 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lxg2" event={"ID":"eab9b71a-d59d-436b-8c0b-c62801ea9326","Type":"ContainerStarted","Data":"ee5e41da06f463e01ac03ab2096d20906f8c521730e0f67671d59d886fde2d5d"} Oct 07 17:06:56 crc kubenswrapper[4681]: I1007 17:06:56.593424 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8lxg2" podStartSLOduration=2.943897872 podStartE2EDuration="1m7.593405781s" podCreationTimestamp="2025-10-07 17:05:49 +0000 UTC" firstStartedPulling="2025-10-07 17:05:50.806025618 +0000 UTC m=+154.453437173" lastFinishedPulling="2025-10-07 17:06:55.455533537 +0000 UTC m=+219.102945082" observedRunningTime="2025-10-07 17:06:56.591802684 +0000 UTC m=+220.239214239" watchObservedRunningTime="2025-10-07 17:06:56.593405781 +0000 UTC m=+220.240817336" Oct 07 17:06:56 crc kubenswrapper[4681]: I1007 17:06:56.595076 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-svcfw" podStartSLOduration=4.855532885 podStartE2EDuration="1m10.595067139s" podCreationTimestamp="2025-10-07 17:05:46 +0000 UTC" firstStartedPulling="2025-10-07 17:05:49.721460371 +0000 UTC m=+153.368871926" lastFinishedPulling="2025-10-07 17:06:55.460994625 +0000 UTC m=+219.108406180" observedRunningTime="2025-10-07 17:06:56.570234098 +0000 UTC m=+220.217645653" watchObservedRunningTime="2025-10-07 17:06:56.595067139 +0000 UTC m=+220.242478694" Oct 07 17:06:56 crc kubenswrapper[4681]: I1007 17:06:56.686428 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4mrgc" Oct 07 17:06:56 crc kubenswrapper[4681]: I1007 17:06:56.796317 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-svcfw" Oct 07 17:06:56 crc kubenswrapper[4681]: I1007 17:06:56.796365 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-svcfw" Oct 07 17:06:57 crc kubenswrapper[4681]: I1007 17:06:57.008764 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cvgwt" Oct 07 17:06:57 crc kubenswrapper[4681]: I1007 17:06:57.009151 4681 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-cvgwt" Oct 07 17:06:57 crc kubenswrapper[4681]: I1007 17:06:57.064463 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cvgwt" Oct 07 17:06:57 crc kubenswrapper[4681]: I1007 17:06:57.641461 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cvgwt" Oct 07 17:06:57 crc kubenswrapper[4681]: I1007 17:06:57.832926 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-svcfw" podUID="390445e9-214f-423d-b39d-9411ca5cf099" containerName="registry-server" probeResult="failure" output=< Oct 07 17:06:57 crc kubenswrapper[4681]: timeout: failed to connect service ":50051" within 1s Oct 07 17:06:57 crc kubenswrapper[4681]: > Oct 07 17:06:58 crc kubenswrapper[4681]: I1007 17:06:58.555222 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f9d22" event={"ID":"eda3a2b0-8a17-40b1-b463-7b98159360db","Type":"ContainerStarted","Data":"8b25ce64e800a715dd075ae661e49b711ad6de50b8f75b8e2fe8b424c148afa9"} Oct 07 17:06:58 crc kubenswrapper[4681]: I1007 17:06:58.796584 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-64mxk" Oct 07 17:06:58 crc kubenswrapper[4681]: I1007 17:06:58.796645 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-64mxk" Oct 07 17:06:58 crc kubenswrapper[4681]: I1007 17:06:58.839820 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-64mxk" Oct 07 17:06:58 crc kubenswrapper[4681]: I1007 17:06:58.857865 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f9d22" podStartSLOduration=3.800943104 podStartE2EDuration="1m9.857848131s" podCreationTimestamp="2025-10-07 17:05:49 +0000 UTC" firstStartedPulling="2025-10-07 17:05:51.831165079 +0000 UTC m=+155.478576634" lastFinishedPulling="2025-10-07 17:06:57.888070106 +0000 UTC m=+221.535481661" observedRunningTime="2025-10-07 17:06:58.573702665 +0000 UTC m=+222.221114220" watchObservedRunningTime="2025-10-07 17:06:58.857848131 +0000 UTC m=+222.505259686" Oct 07 17:06:59 crc kubenswrapper[4681]: I1007 17:06:59.213476 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jnv6r" Oct 07 17:06:59 crc kubenswrapper[4681]: I1007 17:06:59.214277 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jnv6r" Oct 07 17:06:59 crc kubenswrapper[4681]: I1007 17:06:59.257173 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jnv6r" Oct 07 17:06:59 crc kubenswrapper[4681]: I1007 17:06:59.616298 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-64mxk" Oct 07 17:06:59 crc kubenswrapper[4681]: I1007 17:06:59.628751 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jnv6r" Oct 07 17:06:59 crc kubenswrapper[4681]: I1007 17:06:59.781449 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8lxg2" Oct 07 17:06:59 
crc kubenswrapper[4681]: I1007 17:06:59.781484 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8lxg2" Oct 07 17:07:00 crc kubenswrapper[4681]: I1007 17:07:00.189029 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f9d22" Oct 07 17:07:00 crc kubenswrapper[4681]: I1007 17:07:00.189923 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f9d22" Oct 07 17:07:00 crc kubenswrapper[4681]: I1007 17:07:00.598215 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cvgwt"] Oct 07 17:07:00 crc kubenswrapper[4681]: I1007 17:07:00.599036 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cvgwt" podUID="c8118bad-bdb8-4a82-aef8-a70d685fe13a" containerName="registry-server" containerID="cri-o://0e7fd62e794b4c54d2c73acd557c0dd1254720409d1a2e084237ff11ef194b27" gracePeriod=2 Oct 07 17:07:00 crc kubenswrapper[4681]: I1007 17:07:00.836797 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8lxg2" podUID="eab9b71a-d59d-436b-8c0b-c62801ea9326" containerName="registry-server" probeResult="failure" output=< Oct 07 17:07:00 crc kubenswrapper[4681]: timeout: failed to connect service ":50051" within 1s Oct 07 17:07:00 crc kubenswrapper[4681]: > Oct 07 17:07:01 crc kubenswrapper[4681]: I1007 17:07:01.095950 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cvgwt" Oct 07 17:07:01 crc kubenswrapper[4681]: I1007 17:07:01.171306 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs7d4\" (UniqueName: \"kubernetes.io/projected/c8118bad-bdb8-4a82-aef8-a70d685fe13a-kube-api-access-xs7d4\") pod \"c8118bad-bdb8-4a82-aef8-a70d685fe13a\" (UID: \"c8118bad-bdb8-4a82-aef8-a70d685fe13a\") " Oct 07 17:07:01 crc kubenswrapper[4681]: I1007 17:07:01.171378 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8118bad-bdb8-4a82-aef8-a70d685fe13a-utilities\") pod \"c8118bad-bdb8-4a82-aef8-a70d685fe13a\" (UID: \"c8118bad-bdb8-4a82-aef8-a70d685fe13a\") " Oct 07 17:07:01 crc kubenswrapper[4681]: I1007 17:07:01.171454 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8118bad-bdb8-4a82-aef8-a70d685fe13a-catalog-content\") pod \"c8118bad-bdb8-4a82-aef8-a70d685fe13a\" (UID: \"c8118bad-bdb8-4a82-aef8-a70d685fe13a\") " Oct 07 17:07:01 crc kubenswrapper[4681]: I1007 17:07:01.172073 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8118bad-bdb8-4a82-aef8-a70d685fe13a-utilities" (OuterVolumeSpecName: "utilities") pod "c8118bad-bdb8-4a82-aef8-a70d685fe13a" (UID: "c8118bad-bdb8-4a82-aef8-a70d685fe13a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:07:01 crc kubenswrapper[4681]: I1007 17:07:01.180516 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8118bad-bdb8-4a82-aef8-a70d685fe13a-kube-api-access-xs7d4" (OuterVolumeSpecName: "kube-api-access-xs7d4") pod "c8118bad-bdb8-4a82-aef8-a70d685fe13a" (UID: "c8118bad-bdb8-4a82-aef8-a70d685fe13a"). InnerVolumeSpecName "kube-api-access-xs7d4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:07:01 crc kubenswrapper[4681]: I1007 17:07:01.243023 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8118bad-bdb8-4a82-aef8-a70d685fe13a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8118bad-bdb8-4a82-aef8-a70d685fe13a" (UID: "c8118bad-bdb8-4a82-aef8-a70d685fe13a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:07:01 crc kubenswrapper[4681]: I1007 17:07:01.244502 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f9d22" podUID="eda3a2b0-8a17-40b1-b463-7b98159360db" containerName="registry-server" probeResult="failure" output=< Oct 07 17:07:01 crc kubenswrapper[4681]: timeout: failed to connect service ":50051" within 1s Oct 07 17:07:01 crc kubenswrapper[4681]: > Oct 07 17:07:01 crc kubenswrapper[4681]: I1007 17:07:01.272300 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8118bad-bdb8-4a82-aef8-a70d685fe13a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:01 crc kubenswrapper[4681]: I1007 17:07:01.272338 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs7d4\" (UniqueName: \"kubernetes.io/projected/c8118bad-bdb8-4a82-aef8-a70d685fe13a-kube-api-access-xs7d4\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:01 crc kubenswrapper[4681]: I1007 17:07:01.272353 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8118bad-bdb8-4a82-aef8-a70d685fe13a-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:01 crc kubenswrapper[4681]: I1007 17:07:01.572386 4681 generic.go:334] "Generic (PLEG): container finished" podID="c8118bad-bdb8-4a82-aef8-a70d685fe13a" containerID="0e7fd62e794b4c54d2c73acd557c0dd1254720409d1a2e084237ff11ef194b27" exitCode=0 Oct 07 17:07:01 crc kubenswrapper[4681]: I1007 17:07:01.572599 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cvgwt" event={"ID":"c8118bad-bdb8-4a82-aef8-a70d685fe13a","Type":"ContainerDied","Data":"0e7fd62e794b4c54d2c73acd557c0dd1254720409d1a2e084237ff11ef194b27"} Oct 07 17:07:01 crc kubenswrapper[4681]: I1007 17:07:01.572641 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cvgwt" event={"ID":"c8118bad-bdb8-4a82-aef8-a70d685fe13a","Type":"ContainerDied","Data":"55bfe034e893a89a1c377b29c55f9249eccacca3865af67bc53ab115d7b480e4"} Oct 07 17:07:01 crc kubenswrapper[4681]: I1007 17:07:01.572669 4681 scope.go:117] "RemoveContainer" containerID="0e7fd62e794b4c54d2c73acd557c0dd1254720409d1a2e084237ff11ef194b27" Oct 07 17:07:01 crc kubenswrapper[4681]: I1007 17:07:01.572724 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cvgwt" Oct 07 17:07:01 crc kubenswrapper[4681]: I1007 17:07:01.586758 4681 scope.go:117] "RemoveContainer" containerID="15602a6f4888dbcf2b3fc01f42b487112cfa63fab146217643336d79661cb2f7" Oct 07 17:07:01 crc kubenswrapper[4681]: I1007 17:07:01.596776 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cvgwt"] Oct 07 17:07:01 crc kubenswrapper[4681]: I1007 17:07:01.599667 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cvgwt"] Oct 07 17:07:01 crc kubenswrapper[4681]: I1007 17:07:01.613915 4681 scope.go:117] "RemoveContainer" containerID="c436777cbef8bbb19979981f72d809cb2bb2886a456af59a415e7b59b43dab38" Oct 07 17:07:01 crc kubenswrapper[4681]: I1007 17:07:01.632165 4681 scope.go:117] "RemoveContainer" containerID="0e7fd62e794b4c54d2c73acd557c0dd1254720409d1a2e084237ff11ef194b27" Oct 07 17:07:01 crc kubenswrapper[4681]: E1007 17:07:01.632677 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e7fd62e794b4c54d2c73acd557c0dd1254720409d1a2e084237ff11ef194b27\": container with ID starting with 0e7fd62e794b4c54d2c73acd557c0dd1254720409d1a2e084237ff11ef194b27 not found: ID does not exist" containerID="0e7fd62e794b4c54d2c73acd557c0dd1254720409d1a2e084237ff11ef194b27" Oct 07 17:07:01 crc kubenswrapper[4681]: I1007 17:07:01.632701 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e7fd62e794b4c54d2c73acd557c0dd1254720409d1a2e084237ff11ef194b27"} err="failed to get container status \"0e7fd62e794b4c54d2c73acd557c0dd1254720409d1a2e084237ff11ef194b27\": rpc error: code = NotFound desc = could not find container \"0e7fd62e794b4c54d2c73acd557c0dd1254720409d1a2e084237ff11ef194b27\": container with ID starting with 0e7fd62e794b4c54d2c73acd557c0dd1254720409d1a2e084237ff11ef194b27 not found: ID does not exist" Oct 07 17:07:01 crc kubenswrapper[4681]: I1007 17:07:01.632721 4681 scope.go:117] "RemoveContainer" containerID="15602a6f4888dbcf2b3fc01f42b487112cfa63fab146217643336d79661cb2f7" Oct 07 17:07:01 crc kubenswrapper[4681]: E1007 17:07:01.633161 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15602a6f4888dbcf2b3fc01f42b487112cfa63fab146217643336d79661cb2f7\": container with ID starting with 15602a6f4888dbcf2b3fc01f42b487112cfa63fab146217643336d79661cb2f7 not found: ID does not exist" containerID="15602a6f4888dbcf2b3fc01f42b487112cfa63fab146217643336d79661cb2f7" Oct 07 17:07:01 crc kubenswrapper[4681]: I1007 17:07:01.633177 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15602a6f4888dbcf2b3fc01f42b487112cfa63fab146217643336d79661cb2f7"} err="failed to get container status \"15602a6f4888dbcf2b3fc01f42b487112cfa63fab146217643336d79661cb2f7\": rpc error: code = NotFound desc = could not find container \"15602a6f4888dbcf2b3fc01f42b487112cfa63fab146217643336d79661cb2f7\": container with ID starting with 15602a6f4888dbcf2b3fc01f42b487112cfa63fab146217643336d79661cb2f7 not found: ID does not exist" Oct 07 17:07:01 crc kubenswrapper[4681]: I1007 17:07:01.633191 4681 scope.go:117] "RemoveContainer" containerID="c436777cbef8bbb19979981f72d809cb2bb2886a456af59a415e7b59b43dab38" Oct 07 17:07:01 crc kubenswrapper[4681]: E1007 17:07:01.633548 4681 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c436777cbef8bbb19979981f72d809cb2bb2886a456af59a415e7b59b43dab38\": container with ID starting with c436777cbef8bbb19979981f72d809cb2bb2886a456af59a415e7b59b43dab38 not found: ID does not exist" containerID="c436777cbef8bbb19979981f72d809cb2bb2886a456af59a415e7b59b43dab38" Oct 07 17:07:01 crc kubenswrapper[4681]: I1007 17:07:01.633566 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c436777cbef8bbb19979981f72d809cb2bb2886a456af59a415e7b59b43dab38"} err="failed to get container status \"c436777cbef8bbb19979981f72d809cb2bb2886a456af59a415e7b59b43dab38\": rpc error: code = NotFound desc = could not find container \"c436777cbef8bbb19979981f72d809cb2bb2886a456af59a415e7b59b43dab38\": container with ID starting with c436777cbef8bbb19979981f72d809cb2bb2886a456af59a415e7b59b43dab38 not found: ID does not exist" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.035355 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8118bad-bdb8-4a82-aef8-a70d685fe13a" path="/var/lib/kubelet/pods/c8118bad-bdb8-4a82-aef8-a70d685fe13a/volumes" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.199901 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jnv6r"] Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.200631 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jnv6r" podUID="fb8f4ff7-5170-425e-855c-0684f4bdf34b" containerName="registry-server" containerID="cri-o://299057c1bd3802ccae575b82d569e51d605da84d43a8327235f8af4e135cb66a" gracePeriod=2 Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.686757 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ltnkf"] Oct 07 17:07:03 crc kubenswrapper[4681]: E1007 17:07:03.687166 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3" containerName="pruner" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.687265 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3" containerName="pruner" Oct 07 17:07:03 crc kubenswrapper[4681]: E1007 17:07:03.687354 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ecebb16-848c-4597-8f86-7779f4c82530" containerName="extract-content" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.687434 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ecebb16-848c-4597-8f86-7779f4c82530" containerName="extract-content" Oct 07 17:07:03 crc kubenswrapper[4681]: E1007 17:07:03.687504 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41740c78-48da-45f3-9b5a-dd196f55ad8f" containerName="pruner" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.687566 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="41740c78-48da-45f3-9b5a-dd196f55ad8f" containerName="pruner" Oct 07 17:07:03 crc kubenswrapper[4681]: E1007 17:07:03.687671 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ecebb16-848c-4597-8f86-7779f4c82530" containerName="registry-server" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.687739 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ecebb16-848c-4597-8f86-7779f4c82530" containerName="registry-server" Oct 07 17:07:03 crc kubenswrapper[4681]: E1007 17:07:03.687808 4681 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="abc28b46-2a9f-4141-8e65-a9c956e0f261" containerName="collect-profiles" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.687897 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="abc28b46-2a9f-4141-8e65-a9c956e0f261" containerName="collect-profiles" Oct 07 17:07:03 crc kubenswrapper[4681]: E1007 17:07:03.687975 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8118bad-bdb8-4a82-aef8-a70d685fe13a" containerName="registry-server" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.688038 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8118bad-bdb8-4a82-aef8-a70d685fe13a" containerName="registry-server" Oct 07 17:07:03 crc kubenswrapper[4681]: E1007 17:07:03.688093 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8118bad-bdb8-4a82-aef8-a70d685fe13a" containerName="extract-content" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.688172 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8118bad-bdb8-4a82-aef8-a70d685fe13a" containerName="extract-content" Oct 07 17:07:03 crc kubenswrapper[4681]: E1007 17:07:03.688239 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8118bad-bdb8-4a82-aef8-a70d685fe13a" containerName="extract-utilities" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.688305 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8118bad-bdb8-4a82-aef8-a70d685fe13a" containerName="extract-utilities" Oct 07 17:07:03 crc kubenswrapper[4681]: E1007 17:07:03.688378 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ecebb16-848c-4597-8f86-7779f4c82530" containerName="extract-utilities" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.688440 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ecebb16-848c-4597-8f86-7779f4c82530" containerName="extract-utilities" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.688603 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ecebb16-848c-4597-8f86-7779f4c82530" containerName="registry-server" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.688674 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="abc28b46-2a9f-4141-8e65-a9c956e0f261" containerName="collect-profiles" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.688739 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8118bad-bdb8-4a82-aef8-a70d685fe13a" containerName="registry-server" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.688810 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="41740c78-48da-45f3-9b5a-dd196f55ad8f" containerName="pruner" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.688891 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="000d7b74-58d1-4dc0-9014-8fe2f5b9d8c3" containerName="pruner" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.689325 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.706719 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ltnkf"] Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.805412 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7769bfca-e003-4ec4-8bde-d7419b350983-registry-tls\") pod \"image-registry-66df7c8f76-ltnkf\" (UID: \"7769bfca-e003-4ec4-8bde-d7419b350983\") " pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.805465 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7769bfca-e003-4ec4-8bde-d7419b350983-bound-sa-token\") pod \"image-registry-66df7c8f76-ltnkf\" (UID: \"7769bfca-e003-4ec4-8bde-d7419b350983\") " pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.805489 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7769bfca-e003-4ec4-8bde-d7419b350983-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ltnkf\" (UID: \"7769bfca-e003-4ec4-8bde-d7419b350983\") " pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.805507 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r2zk\" (UniqueName: \"kubernetes.io/projected/7769bfca-e003-4ec4-8bde-d7419b350983-kube-api-access-5r2zk\") pod \"image-registry-66df7c8f76-ltnkf\" (UID: \"7769bfca-e003-4ec4-8bde-d7419b350983\") " pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.805597 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-ltnkf\" (UID: \"7769bfca-e003-4ec4-8bde-d7419b350983\") " pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.805617 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7769bfca-e003-4ec4-8bde-d7419b350983-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ltnkf\" (UID: \"7769bfca-e003-4ec4-8bde-d7419b350983\") " pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.805655 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7769bfca-e003-4ec4-8bde-d7419b350983-trusted-ca\") pod \"image-registry-66df7c8f76-ltnkf\" (UID: \"7769bfca-e003-4ec4-8bde-d7419b350983\") " pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.805676 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/7769bfca-e003-4ec4-8bde-d7419b350983-registry-certificates\") pod \"image-registry-66df7c8f76-ltnkf\" (UID: \"7769bfca-e003-4ec4-8bde-d7419b350983\") " pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.827424 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-ltnkf\" (UID: \"7769bfca-e003-4ec4-8bde-d7419b350983\") " pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.906782 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7769bfca-e003-4ec4-8bde-d7419b350983-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ltnkf\" (UID: \"7769bfca-e003-4ec4-8bde-d7419b350983\") " pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.906840 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7769bfca-e003-4ec4-8bde-d7419b350983-trusted-ca\") pod \"image-registry-66df7c8f76-ltnkf\" (UID: \"7769bfca-e003-4ec4-8bde-d7419b350983\") " pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.906858 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7769bfca-e003-4ec4-8bde-d7419b350983-registry-certificates\") pod \"image-registry-66df7c8f76-ltnkf\" (UID: \"7769bfca-e003-4ec4-8bde-d7419b350983\") " pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.906916 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7769bfca-e003-4ec4-8bde-d7419b350983-registry-tls\") pod \"image-registry-66df7c8f76-ltnkf\" (UID: \"7769bfca-e003-4ec4-8bde-d7419b350983\") " pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.906950 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7769bfca-e003-4ec4-8bde-d7419b350983-bound-sa-token\") pod \"image-registry-66df7c8f76-ltnkf\" (UID: \"7769bfca-e003-4ec4-8bde-d7419b350983\") " pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.906966 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7769bfca-e003-4ec4-8bde-d7419b350983-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ltnkf\" (UID: \"7769bfca-e003-4ec4-8bde-d7419b350983\") " pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.906985 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r2zk\" (UniqueName: \"kubernetes.io/projected/7769bfca-e003-4ec4-8bde-d7419b350983-kube-api-access-5r2zk\") pod \"image-registry-66df7c8f76-ltnkf\" (UID: \"7769bfca-e003-4ec4-8bde-d7419b350983\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.907287 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7769bfca-e003-4ec4-8bde-d7419b350983-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ltnkf\" (UID: \"7769bfca-e003-4ec4-8bde-d7419b350983\") " pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.908936 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7769bfca-e003-4ec4-8bde-d7419b350983-trusted-ca\") pod \"image-registry-66df7c8f76-ltnkf\" (UID: \"7769bfca-e003-4ec4-8bde-d7419b350983\") " pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.909575 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7769bfca-e003-4ec4-8bde-d7419b350983-registry-certificates\") pod \"image-registry-66df7c8f76-ltnkf\" (UID: \"7769bfca-e003-4ec4-8bde-d7419b350983\") " pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.913527 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7769bfca-e003-4ec4-8bde-d7419b350983-registry-tls\") pod \"image-registry-66df7c8f76-ltnkf\" (UID: \"7769bfca-e003-4ec4-8bde-d7419b350983\") " pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.915371 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7769bfca-e003-4ec4-8bde-d7419b350983-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ltnkf\" (UID: \"7769bfca-e003-4ec4-8bde-d7419b350983\") " pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.924654 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7769bfca-e003-4ec4-8bde-d7419b350983-bound-sa-token\") pod \"image-registry-66df7c8f76-ltnkf\" (UID: \"7769bfca-e003-4ec4-8bde-d7419b350983\") " pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:03 crc kubenswrapper[4681]: I1007 17:07:03.925032 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r2zk\" (UniqueName: \"kubernetes.io/projected/7769bfca-e003-4ec4-8bde-d7419b350983-kube-api-access-5r2zk\") pod \"image-registry-66df7c8f76-ltnkf\" (UID: \"7769bfca-e003-4ec4-8bde-d7419b350983\") " pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:04 crc kubenswrapper[4681]: I1007 17:07:04.001542 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:04 crc kubenswrapper[4681]: I1007 17:07:04.393066 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ltnkf"] Oct 07 17:07:04 crc kubenswrapper[4681]: W1007 17:07:04.398429 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7769bfca_e003_4ec4_8bde_d7419b350983.slice/crio-83cc861b43a5d3dda17b4732cbf3176d7145eff7a841e18eeeab3260cdd974b6 WatchSource:0}: Error finding container 83cc861b43a5d3dda17b4732cbf3176d7145eff7a841e18eeeab3260cdd974b6: Status 404 returned error can't find the container with id 83cc861b43a5d3dda17b4732cbf3176d7145eff7a841e18eeeab3260cdd974b6 Oct 07 17:07:04 crc kubenswrapper[4681]: I1007 17:07:04.586555 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" event={"ID":"7769bfca-e003-4ec4-8bde-d7419b350983","Type":"ContainerStarted","Data":"83cc861b43a5d3dda17b4732cbf3176d7145eff7a841e18eeeab3260cdd974b6"} Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.456266 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jnv6r" Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.593990 4681 generic.go:334] "Generic (PLEG): container finished" podID="fb8f4ff7-5170-425e-855c-0684f4bdf34b" containerID="299057c1bd3802ccae575b82d569e51d605da84d43a8327235f8af4e135cb66a" exitCode=0 Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.594569 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnv6r" event={"ID":"fb8f4ff7-5170-425e-855c-0684f4bdf34b","Type":"ContainerDied","Data":"299057c1bd3802ccae575b82d569e51d605da84d43a8327235f8af4e135cb66a"} Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.594696 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnv6r" event={"ID":"fb8f4ff7-5170-425e-855c-0684f4bdf34b","Type":"ContainerDied","Data":"1a02f645c00d1a01d061655fb63f45a3c8941044064ca9950968a58377feac90"} Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.594801 4681 scope.go:117] "RemoveContainer" containerID="299057c1bd3802ccae575b82d569e51d605da84d43a8327235f8af4e135cb66a" Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.595013 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jnv6r" Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.599108 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" event={"ID":"7769bfca-e003-4ec4-8bde-d7419b350983","Type":"ContainerStarted","Data":"d42da65954a269e3a232aeb03ecbe58700ee1090048b2d4a3229134dd8d19bc6"} Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.599868 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.619725 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" podStartSLOduration=2.619705038 podStartE2EDuration="2.619705038s" podCreationTimestamp="2025-10-07 17:07:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:07:05.614099836 +0000 UTC m=+229.261511391" watchObservedRunningTime="2025-10-07 17:07:05.619705038 +0000 UTC m=+229.267116603" Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.625660 4681 scope.go:117] "RemoveContainer" containerID="b66a5fe5250983b3653bfe4d371513d34753c302d47bc5a3d08b8073966026af" Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.643032 4681 scope.go:117] "RemoveContainer" containerID="3166eb7498ec67156386892e9eb31db962919b4313ef2c32a8f5118e1843e5e5" Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.643242 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb8f4ff7-5170-425e-855c-0684f4bdf34b-utilities\") pod \"fb8f4ff7-5170-425e-855c-0684f4bdf34b\" (UID: \"fb8f4ff7-5170-425e-855c-0684f4bdf34b\") " Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.643345 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb8f4ff7-5170-425e-855c-0684f4bdf34b-catalog-content\") pod \"fb8f4ff7-5170-425e-855c-0684f4bdf34b\" (UID: \"fb8f4ff7-5170-425e-855c-0684f4bdf34b\") " Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.643425 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvvcz\" (UniqueName: \"kubernetes.io/projected/fb8f4ff7-5170-425e-855c-0684f4bdf34b-kube-api-access-jvvcz\") pod \"fb8f4ff7-5170-425e-855c-0684f4bdf34b\" (UID: \"fb8f4ff7-5170-425e-855c-0684f4bdf34b\") " Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.644308 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb8f4ff7-5170-425e-855c-0684f4bdf34b-utilities" (OuterVolumeSpecName: "utilities") pod "fb8f4ff7-5170-425e-855c-0684f4bdf34b" (UID: "fb8f4ff7-5170-425e-855c-0684f4bdf34b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.652915 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb8f4ff7-5170-425e-855c-0684f4bdf34b-kube-api-access-jvvcz" (OuterVolumeSpecName: "kube-api-access-jvvcz") pod "fb8f4ff7-5170-425e-855c-0684f4bdf34b" (UID: "fb8f4ff7-5170-425e-855c-0684f4bdf34b"). InnerVolumeSpecName "kube-api-access-jvvcz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.660712 4681 scope.go:117] "RemoveContainer" containerID="299057c1bd3802ccae575b82d569e51d605da84d43a8327235f8af4e135cb66a" Oct 07 17:07:05 crc kubenswrapper[4681]: E1007 17:07:05.662077 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"299057c1bd3802ccae575b82d569e51d605da84d43a8327235f8af4e135cb66a\": container with ID starting with 299057c1bd3802ccae575b82d569e51d605da84d43a8327235f8af4e135cb66a not found: ID does not exist" containerID="299057c1bd3802ccae575b82d569e51d605da84d43a8327235f8af4e135cb66a" Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.662132 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"299057c1bd3802ccae575b82d569e51d605da84d43a8327235f8af4e135cb66a"} err="failed to get container status \"299057c1bd3802ccae575b82d569e51d605da84d43a8327235f8af4e135cb66a\": rpc error: code = NotFound desc = could not find container \"299057c1bd3802ccae575b82d569e51d605da84d43a8327235f8af4e135cb66a\": container with ID starting with 299057c1bd3802ccae575b82d569e51d605da84d43a8327235f8af4e135cb66a not found: ID does not exist" Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.662164 4681 scope.go:117] "RemoveContainer" containerID="b66a5fe5250983b3653bfe4d371513d34753c302d47bc5a3d08b8073966026af" Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.662489 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb8f4ff7-5170-425e-855c-0684f4bdf34b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb8f4ff7-5170-425e-855c-0684f4bdf34b" (UID: "fb8f4ff7-5170-425e-855c-0684f4bdf34b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:07:05 crc kubenswrapper[4681]: E1007 17:07:05.663207 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b66a5fe5250983b3653bfe4d371513d34753c302d47bc5a3d08b8073966026af\": container with ID starting with b66a5fe5250983b3653bfe4d371513d34753c302d47bc5a3d08b8073966026af not found: ID does not exist" containerID="b66a5fe5250983b3653bfe4d371513d34753c302d47bc5a3d08b8073966026af" Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.663258 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b66a5fe5250983b3653bfe4d371513d34753c302d47bc5a3d08b8073966026af"} err="failed to get container status \"b66a5fe5250983b3653bfe4d371513d34753c302d47bc5a3d08b8073966026af\": rpc error: code = NotFound desc = could not find container \"b66a5fe5250983b3653bfe4d371513d34753c302d47bc5a3d08b8073966026af\": container with ID starting with b66a5fe5250983b3653bfe4d371513d34753c302d47bc5a3d08b8073966026af not found: ID does not exist" Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.663284 4681 scope.go:117] "RemoveContainer" containerID="3166eb7498ec67156386892e9eb31db962919b4313ef2c32a8f5118e1843e5e5" Oct 07 17:07:05 crc kubenswrapper[4681]: E1007 17:07:05.663654 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3166eb7498ec67156386892e9eb31db962919b4313ef2c32a8f5118e1843e5e5\": container with ID starting with 3166eb7498ec67156386892e9eb31db962919b4313ef2c32a8f5118e1843e5e5 not found: ID does not exist" containerID="3166eb7498ec67156386892e9eb31db962919b4313ef2c32a8f5118e1843e5e5" Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.664038 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3166eb7498ec67156386892e9eb31db962919b4313ef2c32a8f5118e1843e5e5"} err="failed to get container status \"3166eb7498ec67156386892e9eb31db962919b4313ef2c32a8f5118e1843e5e5\": rpc error: code = NotFound desc = could not find container \"3166eb7498ec67156386892e9eb31db962919b4313ef2c32a8f5118e1843e5e5\": container with ID starting with 3166eb7498ec67156386892e9eb31db962919b4313ef2c32a8f5118e1843e5e5 not found: ID does not exist" Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.747135 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvvcz\" (UniqueName: \"kubernetes.io/projected/fb8f4ff7-5170-425e-855c-0684f4bdf34b-kube-api-access-jvvcz\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.747162 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb8f4ff7-5170-425e-855c-0684f4bdf34b-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.747171 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb8f4ff7-5170-425e-855c-0684f4bdf34b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.920489 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jnv6r"] Oct 07 17:07:05 crc kubenswrapper[4681]: I1007 17:07:05.925855 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jnv6r"] Oct 07 17:07:06 crc kubenswrapper[4681]: I1007 
17:07:06.839042 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-svcfw" Oct 07 17:07:06 crc kubenswrapper[4681]: I1007 17:07:06.878146 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-svcfw" Oct 07 17:07:07 crc kubenswrapper[4681]: I1007 17:07:07.035756 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb8f4ff7-5170-425e-855c-0684f4bdf34b" path="/var/lib/kubelet/pods/fb8f4ff7-5170-425e-855c-0684f4bdf34b/volumes" Oct 07 17:07:09 crc kubenswrapper[4681]: I1007 17:07:09.053226 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tw9ww"] Oct 07 17:07:09 crc kubenswrapper[4681]: I1007 17:07:09.822743 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8lxg2" Oct 07 17:07:09 crc kubenswrapper[4681]: I1007 17:07:09.863649 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8lxg2" Oct 07 17:07:10 crc kubenswrapper[4681]: I1007 17:07:10.234713 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f9d22" Oct 07 17:07:10 crc kubenswrapper[4681]: I1007 17:07:10.284849 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f9d22" Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.500564 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-svcfw"] Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.501397 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-svcfw" podUID="390445e9-214f-423d-b39d-9411ca5cf099" containerName="registry-server" containerID="cri-o://82f446b3e202f296984385071a20f4ecd4ad5cae4b4739b020d223d8000196e4" gracePeriod=30 Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.510691 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4mrgc"] Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.511032 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4mrgc" podUID="29ae017e-1bbd-4cf3-bda3-5fd9a25866c7" containerName="registry-server" containerID="cri-o://9708daad2e691c08a7db9e717464f432b35883e6d923124bacaf62aa7c8a5d73" gracePeriod=30 Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.520759 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hq69t"] Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.521081 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-hq69t" podUID="61783842-70fb-40b1-bc57-f614ca527168" containerName="marketplace-operator" containerID="cri-o://bada8967ed476c5e7fdf78792d0cc48002e43135934a849fbf6789e6c5ef7c45" gracePeriod=30 Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.525838 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-64mxk"] Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.526292 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-64mxk" 
podUID="dbfe57d1-0360-4f50-b36c-cc80a36f868e" containerName="registry-server" containerID="cri-o://348072a04a5b2834fbd8a89e53701d4a6f46857770508b7d6f38a69fa17544e5" gracePeriod=30 Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.537934 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8lxg2"] Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.545847 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kbm6c"] Oct 07 17:07:11 crc kubenswrapper[4681]: E1007 17:07:11.546079 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb8f4ff7-5170-425e-855c-0684f4bdf34b" containerName="extract-content" Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.546095 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8f4ff7-5170-425e-855c-0684f4bdf34b" containerName="extract-content" Oct 07 17:07:11 crc kubenswrapper[4681]: E1007 17:07:11.546102 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb8f4ff7-5170-425e-855c-0684f4bdf34b" containerName="extract-utilities" Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.546108 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8f4ff7-5170-425e-855c-0684f4bdf34b" containerName="extract-utilities" Oct 07 17:07:11 crc kubenswrapper[4681]: E1007 17:07:11.546128 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb8f4ff7-5170-425e-855c-0684f4bdf34b" containerName="registry-server" Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.546135 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8f4ff7-5170-425e-855c-0684f4bdf34b" containerName="registry-server" Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.546237 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb8f4ff7-5170-425e-855c-0684f4bdf34b" containerName="registry-server" Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.546623 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kbm6c" Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.558024 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f9d22"] Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.561420 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kbm6c"] Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.642171 4681 generic.go:334] "Generic (PLEG): container finished" podID="390445e9-214f-423d-b39d-9411ca5cf099" containerID="82f446b3e202f296984385071a20f4ecd4ad5cae4b4739b020d223d8000196e4" exitCode=0 Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.642358 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svcfw" event={"ID":"390445e9-214f-423d-b39d-9411ca5cf099","Type":"ContainerDied","Data":"82f446b3e202f296984385071a20f4ecd4ad5cae4b4739b020d223d8000196e4"} Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.645341 4681 generic.go:334] "Generic (PLEG): container finished" podID="29ae017e-1bbd-4cf3-bda3-5fd9a25866c7" containerID="9708daad2e691c08a7db9e717464f432b35883e6d923124bacaf62aa7c8a5d73" exitCode=0 Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.645551 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8lxg2" podUID="eab9b71a-d59d-436b-8c0b-c62801ea9326" containerName="registry-server" containerID="cri-o://ee5e41da06f463e01ac03ab2096d20906f8c521730e0f67671d59d886fde2d5d" gracePeriod=30 Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.646030 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mrgc" event={"ID":"29ae017e-1bbd-4cf3-bda3-5fd9a25866c7","Type":"ContainerDied","Data":"9708daad2e691c08a7db9e717464f432b35883e6d923124bacaf62aa7c8a5d73"} Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.646327 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f9d22" podUID="eda3a2b0-8a17-40b1-b463-7b98159360db" containerName="registry-server" containerID="cri-o://8b25ce64e800a715dd075ae661e49b711ad6de50b8f75b8e2fe8b424c148afa9" gracePeriod=30 Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.723276 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4bf28f5b-4ee1-444b-ad73-7d63ecbd05c9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kbm6c\" (UID: \"4bf28f5b-4ee1-444b-ad73-7d63ecbd05c9\") " pod="openshift-marketplace/marketplace-operator-79b997595-kbm6c" Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.723341 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b2t7\" (UniqueName: \"kubernetes.io/projected/4bf28f5b-4ee1-444b-ad73-7d63ecbd05c9-kube-api-access-8b2t7\") pod \"marketplace-operator-79b997595-kbm6c\" (UID: \"4bf28f5b-4ee1-444b-ad73-7d63ecbd05c9\") " pod="openshift-marketplace/marketplace-operator-79b997595-kbm6c" Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.723367 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4bf28f5b-4ee1-444b-ad73-7d63ecbd05c9-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-kbm6c\" (UID: \"4bf28f5b-4ee1-444b-ad73-7d63ecbd05c9\") " pod="openshift-marketplace/marketplace-operator-79b997595-kbm6c" Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.824101 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4bf28f5b-4ee1-444b-ad73-7d63ecbd05c9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kbm6c\" (UID: \"4bf28f5b-4ee1-444b-ad73-7d63ecbd05c9\") " pod="openshift-marketplace/marketplace-operator-79b997595-kbm6c" Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.825245 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b2t7\" (UniqueName: \"kubernetes.io/projected/4bf28f5b-4ee1-444b-ad73-7d63ecbd05c9-kube-api-access-8b2t7\") pod \"marketplace-operator-79b997595-kbm6c\" (UID: \"4bf28f5b-4ee1-444b-ad73-7d63ecbd05c9\") " pod="openshift-marketplace/marketplace-operator-79b997595-kbm6c" Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.825293 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4bf28f5b-4ee1-444b-ad73-7d63ecbd05c9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kbm6c\" (UID: \"4bf28f5b-4ee1-444b-ad73-7d63ecbd05c9\") " pod="openshift-marketplace/marketplace-operator-79b997595-kbm6c" Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.827429 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4bf28f5b-4ee1-444b-ad73-7d63ecbd05c9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kbm6c\" (UID: \"4bf28f5b-4ee1-444b-ad73-7d63ecbd05c9\") " pod="openshift-marketplace/marketplace-operator-79b997595-kbm6c" Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.843788 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b2t7\" (UniqueName: \"kubernetes.io/projected/4bf28f5b-4ee1-444b-ad73-7d63ecbd05c9-kube-api-access-8b2t7\") pod \"marketplace-operator-79b997595-kbm6c\" (UID: \"4bf28f5b-4ee1-444b-ad73-7d63ecbd05c9\") " pod="openshift-marketplace/marketplace-operator-79b997595-kbm6c" Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.844664 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4bf28f5b-4ee1-444b-ad73-7d63ecbd05c9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kbm6c\" (UID: \"4bf28f5b-4ee1-444b-ad73-7d63ecbd05c9\") " pod="openshift-marketplace/marketplace-operator-79b997595-kbm6c" Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.867158 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kbm6c" Oct 07 17:07:11 crc kubenswrapper[4681]: I1007 17:07:11.937416 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4mrgc" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.034725 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29ae017e-1bbd-4cf3-bda3-5fd9a25866c7-catalog-content\") pod \"29ae017e-1bbd-4cf3-bda3-5fd9a25866c7\" (UID: \"29ae017e-1bbd-4cf3-bda3-5fd9a25866c7\") " Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.034834 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlw5n\" (UniqueName: \"kubernetes.io/projected/29ae017e-1bbd-4cf3-bda3-5fd9a25866c7-kube-api-access-hlw5n\") pod \"29ae017e-1bbd-4cf3-bda3-5fd9a25866c7\" (UID: \"29ae017e-1bbd-4cf3-bda3-5fd9a25866c7\") " Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.034864 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29ae017e-1bbd-4cf3-bda3-5fd9a25866c7-utilities\") pod \"29ae017e-1bbd-4cf3-bda3-5fd9a25866c7\" (UID: \"29ae017e-1bbd-4cf3-bda3-5fd9a25866c7\") " Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.035802 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29ae017e-1bbd-4cf3-bda3-5fd9a25866c7-utilities" (OuterVolumeSpecName: "utilities") pod "29ae017e-1bbd-4cf3-bda3-5fd9a25866c7" (UID: "29ae017e-1bbd-4cf3-bda3-5fd9a25866c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.042826 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29ae017e-1bbd-4cf3-bda3-5fd9a25866c7-kube-api-access-hlw5n" (OuterVolumeSpecName: "kube-api-access-hlw5n") pod "29ae017e-1bbd-4cf3-bda3-5fd9a25866c7" (UID: "29ae017e-1bbd-4cf3-bda3-5fd9a25866c7"). InnerVolumeSpecName "kube-api-access-hlw5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.044109 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hq69t" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.136475 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/61783842-70fb-40b1-bc57-f614ca527168-marketplace-operator-metrics\") pod \"61783842-70fb-40b1-bc57-f614ca527168\" (UID: \"61783842-70fb-40b1-bc57-f614ca527168\") " Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.136516 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7qh4\" (UniqueName: \"kubernetes.io/projected/61783842-70fb-40b1-bc57-f614ca527168-kube-api-access-m7qh4\") pod \"61783842-70fb-40b1-bc57-f614ca527168\" (UID: \"61783842-70fb-40b1-bc57-f614ca527168\") " Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.136603 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61783842-70fb-40b1-bc57-f614ca527168-marketplace-trusted-ca\") pod \"61783842-70fb-40b1-bc57-f614ca527168\" (UID: \"61783842-70fb-40b1-bc57-f614ca527168\") " Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.136789 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlw5n\" (UniqueName: \"kubernetes.io/projected/29ae017e-1bbd-4cf3-bda3-5fd9a25866c7-kube-api-access-hlw5n\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.136804 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29ae017e-1bbd-4cf3-bda3-5fd9a25866c7-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.137385 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61783842-70fb-40b1-bc57-f614ca527168-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "61783842-70fb-40b1-bc57-f614ca527168" (UID: "61783842-70fb-40b1-bc57-f614ca527168"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.146243 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61783842-70fb-40b1-bc57-f614ca527168-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "61783842-70fb-40b1-bc57-f614ca527168" (UID: "61783842-70fb-40b1-bc57-f614ca527168"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.147208 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29ae017e-1bbd-4cf3-bda3-5fd9a25866c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29ae017e-1bbd-4cf3-bda3-5fd9a25866c7" (UID: "29ae017e-1bbd-4cf3-bda3-5fd9a25866c7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.165868 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61783842-70fb-40b1-bc57-f614ca527168-kube-api-access-m7qh4" (OuterVolumeSpecName: "kube-api-access-m7qh4") pod "61783842-70fb-40b1-bc57-f614ca527168" (UID: "61783842-70fb-40b1-bc57-f614ca527168"). InnerVolumeSpecName "kube-api-access-m7qh4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.225244 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8lxg2" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.229649 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f9d22" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.237802 4681 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61783842-70fb-40b1-bc57-f614ca527168-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.237835 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29ae017e-1bbd-4cf3-bda3-5fd9a25866c7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.237845 4681 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/61783842-70fb-40b1-bc57-f614ca527168-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.237854 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7qh4\" (UniqueName: \"kubernetes.io/projected/61783842-70fb-40b1-bc57-f614ca527168-kube-api-access-m7qh4\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.251069 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64mxk" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.266196 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-svcfw" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.316093 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kbm6c"] Oct 07 17:07:12 crc kubenswrapper[4681]: W1007 17:07:12.323722 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bf28f5b_4ee1_444b_ad73_7d63ecbd05c9.slice/crio-14ca8a266153fa8d7bca3da7bb629728a26bd84150c1dfe12b50011e0fe0366d WatchSource:0}: Error finding container 14ca8a266153fa8d7bca3da7bb629728a26bd84150c1dfe12b50011e0fe0366d: Status 404 returned error can't find the container with id 14ca8a266153fa8d7bca3da7bb629728a26bd84150c1dfe12b50011e0fe0366d Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.338796 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmd5p\" (UniqueName: \"kubernetes.io/projected/dbfe57d1-0360-4f50-b36c-cc80a36f868e-kube-api-access-rmd5p\") pod \"dbfe57d1-0360-4f50-b36c-cc80a36f868e\" (UID: \"dbfe57d1-0360-4f50-b36c-cc80a36f868e\") " Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.338864 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eab9b71a-d59d-436b-8c0b-c62801ea9326-catalog-content\") pod \"eab9b71a-d59d-436b-8c0b-c62801ea9326\" (UID: \"eab9b71a-d59d-436b-8c0b-c62801ea9326\") " Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.338910 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eab9b71a-d59d-436b-8c0b-c62801ea9326-utilities\") pod \"eab9b71a-d59d-436b-8c0b-c62801ea9326\" (UID: \"eab9b71a-d59d-436b-8c0b-c62801ea9326\") " Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.338927 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eda3a2b0-8a17-40b1-b463-7b98159360db-utilities\") pod \"eda3a2b0-8a17-40b1-b463-7b98159360db\" (UID: \"eda3a2b0-8a17-40b1-b463-7b98159360db\") " Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.338946 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm2q4\" (UniqueName: \"kubernetes.io/projected/eab9b71a-d59d-436b-8c0b-c62801ea9326-kube-api-access-cm2q4\") pod \"eab9b71a-d59d-436b-8c0b-c62801ea9326\" (UID: \"eab9b71a-d59d-436b-8c0b-c62801ea9326\") " Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.339019 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqzlc\" (UniqueName: \"kubernetes.io/projected/eda3a2b0-8a17-40b1-b463-7b98159360db-kube-api-access-wqzlc\") pod \"eda3a2b0-8a17-40b1-b463-7b98159360db\" (UID: \"eda3a2b0-8a17-40b1-b463-7b98159360db\") " Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.339039 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eda3a2b0-8a17-40b1-b463-7b98159360db-catalog-content\") pod \"eda3a2b0-8a17-40b1-b463-7b98159360db\" (UID: \"eda3a2b0-8a17-40b1-b463-7b98159360db\") " Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.339108 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/dbfe57d1-0360-4f50-b36c-cc80a36f868e-catalog-content\") pod \"dbfe57d1-0360-4f50-b36c-cc80a36f868e\" (UID: \"dbfe57d1-0360-4f50-b36c-cc80a36f868e\") " Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.339145 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbfe57d1-0360-4f50-b36c-cc80a36f868e-utilities\") pod \"dbfe57d1-0360-4f50-b36c-cc80a36f868e\" (UID: \"dbfe57d1-0360-4f50-b36c-cc80a36f868e\") " Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.339556 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eab9b71a-d59d-436b-8c0b-c62801ea9326-utilities" (OuterVolumeSpecName: "utilities") pod "eab9b71a-d59d-436b-8c0b-c62801ea9326" (UID: "eab9b71a-d59d-436b-8c0b-c62801ea9326"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.339853 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eda3a2b0-8a17-40b1-b463-7b98159360db-utilities" (OuterVolumeSpecName: "utilities") pod "eda3a2b0-8a17-40b1-b463-7b98159360db" (UID: "eda3a2b0-8a17-40b1-b463-7b98159360db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.340353 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbfe57d1-0360-4f50-b36c-cc80a36f868e-utilities" (OuterVolumeSpecName: "utilities") pod "dbfe57d1-0360-4f50-b36c-cc80a36f868e" (UID: "dbfe57d1-0360-4f50-b36c-cc80a36f868e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.352757 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbfe57d1-0360-4f50-b36c-cc80a36f868e-kube-api-access-rmd5p" (OuterVolumeSpecName: "kube-api-access-rmd5p") pod "dbfe57d1-0360-4f50-b36c-cc80a36f868e" (UID: "dbfe57d1-0360-4f50-b36c-cc80a36f868e"). InnerVolumeSpecName "kube-api-access-rmd5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.353374 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eab9b71a-d59d-436b-8c0b-c62801ea9326-kube-api-access-cm2q4" (OuterVolumeSpecName: "kube-api-access-cm2q4") pod "eab9b71a-d59d-436b-8c0b-c62801ea9326" (UID: "eab9b71a-d59d-436b-8c0b-c62801ea9326"). InnerVolumeSpecName "kube-api-access-cm2q4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.353335 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eda3a2b0-8a17-40b1-b463-7b98159360db-kube-api-access-wqzlc" (OuterVolumeSpecName: "kube-api-access-wqzlc") pod "eda3a2b0-8a17-40b1-b463-7b98159360db" (UID: "eda3a2b0-8a17-40b1-b463-7b98159360db"). InnerVolumeSpecName "kube-api-access-wqzlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.354310 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbfe57d1-0360-4f50-b36c-cc80a36f868e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dbfe57d1-0360-4f50-b36c-cc80a36f868e" (UID: "dbfe57d1-0360-4f50-b36c-cc80a36f868e"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.354471 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbfe57d1-0360-4f50-b36c-cc80a36f868e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.354486 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbfe57d1-0360-4f50-b36c-cc80a36f868e-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.354516 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmd5p\" (UniqueName: \"kubernetes.io/projected/dbfe57d1-0360-4f50-b36c-cc80a36f868e-kube-api-access-rmd5p\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.354528 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eda3a2b0-8a17-40b1-b463-7b98159360db-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.354536 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eab9b71a-d59d-436b-8c0b-c62801ea9326-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.354545 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm2q4\" (UniqueName: \"kubernetes.io/projected/eab9b71a-d59d-436b-8c0b-c62801ea9326-kube-api-access-cm2q4\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.354554 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqzlc\" (UniqueName: \"kubernetes.io/projected/eda3a2b0-8a17-40b1-b463-7b98159360db-kube-api-access-wqzlc\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.428407 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eab9b71a-d59d-436b-8c0b-c62801ea9326-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eab9b71a-d59d-436b-8c0b-c62801ea9326" (UID: "eab9b71a-d59d-436b-8c0b-c62801ea9326"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.455767 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7psn5\" (UniqueName: \"kubernetes.io/projected/390445e9-214f-423d-b39d-9411ca5cf099-kube-api-access-7psn5\") pod \"390445e9-214f-423d-b39d-9411ca5cf099\" (UID: \"390445e9-214f-423d-b39d-9411ca5cf099\") " Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.455885 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/390445e9-214f-423d-b39d-9411ca5cf099-utilities\") pod \"390445e9-214f-423d-b39d-9411ca5cf099\" (UID: \"390445e9-214f-423d-b39d-9411ca5cf099\") " Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.455948 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/390445e9-214f-423d-b39d-9411ca5cf099-catalog-content\") pod \"390445e9-214f-423d-b39d-9411ca5cf099\" (UID: \"390445e9-214f-423d-b39d-9411ca5cf099\") " Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.456168 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eab9b71a-d59d-436b-8c0b-c62801ea9326-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.457633 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/390445e9-214f-423d-b39d-9411ca5cf099-utilities" (OuterVolumeSpecName: "utilities") pod "390445e9-214f-423d-b39d-9411ca5cf099" (UID: "390445e9-214f-423d-b39d-9411ca5cf099"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.464160 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/390445e9-214f-423d-b39d-9411ca5cf099-kube-api-access-7psn5" (OuterVolumeSpecName: "kube-api-access-7psn5") pod "390445e9-214f-423d-b39d-9411ca5cf099" (UID: "390445e9-214f-423d-b39d-9411ca5cf099"). InnerVolumeSpecName "kube-api-access-7psn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.469325 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eda3a2b0-8a17-40b1-b463-7b98159360db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eda3a2b0-8a17-40b1-b463-7b98159360db" (UID: "eda3a2b0-8a17-40b1-b463-7b98159360db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.497764 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/390445e9-214f-423d-b39d-9411ca5cf099-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "390445e9-214f-423d-b39d-9411ca5cf099" (UID: "390445e9-214f-423d-b39d-9411ca5cf099"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.557686 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7psn5\" (UniqueName: \"kubernetes.io/projected/390445e9-214f-423d-b39d-9411ca5cf099-kube-api-access-7psn5\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.557732 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eda3a2b0-8a17-40b1-b463-7b98159360db-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.557742 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/390445e9-214f-423d-b39d-9411ca5cf099-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.557749 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/390445e9-214f-423d-b39d-9411ca5cf099-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.652162 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mrgc" event={"ID":"29ae017e-1bbd-4cf3-bda3-5fd9a25866c7","Type":"ContainerDied","Data":"ec0936ae1068a70aca9997ea6801b5a4d6464ae2be034d86021d1d1609bea179"} Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.652240 4681 scope.go:117] "RemoveContainer" containerID="9708daad2e691c08a7db9e717464f432b35883e6d923124bacaf62aa7c8a5d73" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.652345 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4mrgc" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.654153 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kbm6c" event={"ID":"4bf28f5b-4ee1-444b-ad73-7d63ecbd05c9","Type":"ContainerStarted","Data":"1a5e2e42d77c368dc50a7c5d6c24db338b30b08a9e2a6080ea446eeb7eaf6445"} Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.654190 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kbm6c" event={"ID":"4bf28f5b-4ee1-444b-ad73-7d63ecbd05c9","Type":"ContainerStarted","Data":"14ca8a266153fa8d7bca3da7bb629728a26bd84150c1dfe12b50011e0fe0366d"} Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.654675 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-kbm6c" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.655840 4681 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-kbm6c container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" start-of-body= Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.655913 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-kbm6c" podUID="4bf28f5b-4ee1-444b-ad73-7d63ecbd05c9" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.658513 4681 generic.go:334] "Generic (PLEG): 
container finished" podID="eda3a2b0-8a17-40b1-b463-7b98159360db" containerID="8b25ce64e800a715dd075ae661e49b711ad6de50b8f75b8e2fe8b424c148afa9" exitCode=0 Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.658557 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f9d22" event={"ID":"eda3a2b0-8a17-40b1-b463-7b98159360db","Type":"ContainerDied","Data":"8b25ce64e800a715dd075ae661e49b711ad6de50b8f75b8e2fe8b424c148afa9"} Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.658581 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f9d22" event={"ID":"eda3a2b0-8a17-40b1-b463-7b98159360db","Type":"ContainerDied","Data":"05350b882f1004cbff3e3215251ec9af8eae2b7e5ed57571c74f22947a733dd1"} Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.658638 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f9d22" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.669384 4681 generic.go:334] "Generic (PLEG): container finished" podID="eab9b71a-d59d-436b-8c0b-c62801ea9326" containerID="ee5e41da06f463e01ac03ab2096d20906f8c521730e0f67671d59d886fde2d5d" exitCode=0 Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.669460 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lxg2" event={"ID":"eab9b71a-d59d-436b-8c0b-c62801ea9326","Type":"ContainerDied","Data":"ee5e41da06f463e01ac03ab2096d20906f8c521730e0f67671d59d886fde2d5d"} Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.669511 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lxg2" event={"ID":"eab9b71a-d59d-436b-8c0b-c62801ea9326","Type":"ContainerDied","Data":"e326d2f00b7e4f8e427b18edc5940ba09611546146b5aa85fd3fa7cd55d54b40"} Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.669588 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8lxg2" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.680146 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-kbm6c" podStartSLOduration=1.6801291699999998 podStartE2EDuration="1.68012917s" podCreationTimestamp="2025-10-07 17:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:07:12.677167584 +0000 UTC m=+236.324579149" watchObservedRunningTime="2025-10-07 17:07:12.68012917 +0000 UTC m=+236.327540725" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.682284 4681 scope.go:117] "RemoveContainer" containerID="fb3247a1c61885af0659f1602ab3e021ec519f1774fee86e64c4abbf7d1cd3a5" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.689163 4681 generic.go:334] "Generic (PLEG): container finished" podID="61783842-70fb-40b1-bc57-f614ca527168" containerID="bada8967ed476c5e7fdf78792d0cc48002e43135934a849fbf6789e6c5ef7c45" exitCode=0 Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.689240 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hq69t" event={"ID":"61783842-70fb-40b1-bc57-f614ca527168","Type":"ContainerDied","Data":"bada8967ed476c5e7fdf78792d0cc48002e43135934a849fbf6789e6c5ef7c45"} Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.689269 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hq69t" event={"ID":"61783842-70fb-40b1-bc57-f614ca527168","Type":"ContainerDied","Data":"92a72098a7ac8ed3169de6adac3c31ab6cd5e89ba28d88dd5cb7b32a33efd9a6"} Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.691037 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hq69t" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.695804 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svcfw" event={"ID":"390445e9-214f-423d-b39d-9411ca5cf099","Type":"ContainerDied","Data":"95c86c1191ef5e01212c04203e4320d28e7ea09ae385b5fb3dcff574726791fb"} Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.695937 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-svcfw" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.697577 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4mrgc"] Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.700736 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4mrgc"] Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.701844 4681 generic.go:334] "Generic (PLEG): container finished" podID="dbfe57d1-0360-4f50-b36c-cc80a36f868e" containerID="348072a04a5b2834fbd8a89e53701d4a6f46857770508b7d6f38a69fa17544e5" exitCode=0 Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.701923 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64mxk" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.701940 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64mxk" event={"ID":"dbfe57d1-0360-4f50-b36c-cc80a36f868e","Type":"ContainerDied","Data":"348072a04a5b2834fbd8a89e53701d4a6f46857770508b7d6f38a69fa17544e5"} Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.701981 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64mxk" event={"ID":"dbfe57d1-0360-4f50-b36c-cc80a36f868e","Type":"ContainerDied","Data":"bd250fbc9ef6a2bdb50bf5820df03781d4d76e7e81ced7222a5c312f2d304ae9"} Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.716566 4681 scope.go:117] "RemoveContainer" containerID="3351bbf0473ce77bdf501af4ba26589490a1caba900809edc6a0d7a99b32754e" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.734565 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8lxg2"] Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.738313 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8lxg2"] Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.754047 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f9d22"] Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.759919 4681 scope.go:117] "RemoveContainer" containerID="8b25ce64e800a715dd075ae661e49b711ad6de50b8f75b8e2fe8b424c148afa9" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.760764 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f9d22"] Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.777302 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-64mxk"] Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.786380 4681 scope.go:117] "RemoveContainer" containerID="a02780ccfc64bb506b8f324c2778c2bff70cebe91951b030b8a5b69a112c4a92" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.789624 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-64mxk"] Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.791603 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-svcfw"] Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.800926 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-svcfw"] Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.808445 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hq69t"] Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.831637 4681 scope.go:117] "RemoveContainer" containerID="11f1122cfd69ef2836aa6b430ca28182f2e3c008fc98facfd19b82cc5d6c87e1" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.831814 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hq69t"] Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.856744 4681 scope.go:117] "RemoveContainer" containerID="8b25ce64e800a715dd075ae661e49b711ad6de50b8f75b8e2fe8b424c148afa9" Oct 07 17:07:12 crc kubenswrapper[4681]: E1007 17:07:12.857290 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8b25ce64e800a715dd075ae661e49b711ad6de50b8f75b8e2fe8b424c148afa9\": container with ID starting with 8b25ce64e800a715dd075ae661e49b711ad6de50b8f75b8e2fe8b424c148afa9 not found: ID does not exist" containerID="8b25ce64e800a715dd075ae661e49b711ad6de50b8f75b8e2fe8b424c148afa9" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.857325 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b25ce64e800a715dd075ae661e49b711ad6de50b8f75b8e2fe8b424c148afa9"} err="failed to get container status \"8b25ce64e800a715dd075ae661e49b711ad6de50b8f75b8e2fe8b424c148afa9\": rpc error: code = NotFound desc = could not find container \"8b25ce64e800a715dd075ae661e49b711ad6de50b8f75b8e2fe8b424c148afa9\": container with ID starting with 8b25ce64e800a715dd075ae661e49b711ad6de50b8f75b8e2fe8b424c148afa9 not found: ID does not exist" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.857347 4681 scope.go:117] "RemoveContainer" containerID="a02780ccfc64bb506b8f324c2778c2bff70cebe91951b030b8a5b69a112c4a92" Oct 07 17:07:12 crc kubenswrapper[4681]: E1007 17:07:12.857684 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a02780ccfc64bb506b8f324c2778c2bff70cebe91951b030b8a5b69a112c4a92\": container with ID starting with a02780ccfc64bb506b8f324c2778c2bff70cebe91951b030b8a5b69a112c4a92 not found: ID does not exist" containerID="a02780ccfc64bb506b8f324c2778c2bff70cebe91951b030b8a5b69a112c4a92" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.857723 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a02780ccfc64bb506b8f324c2778c2bff70cebe91951b030b8a5b69a112c4a92"} err="failed to get container status \"a02780ccfc64bb506b8f324c2778c2bff70cebe91951b030b8a5b69a112c4a92\": rpc error: code = NotFound desc = could not find container \"a02780ccfc64bb506b8f324c2778c2bff70cebe91951b030b8a5b69a112c4a92\": container with ID starting with a02780ccfc64bb506b8f324c2778c2bff70cebe91951b030b8a5b69a112c4a92 not found: ID does not exist" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.857753 4681 scope.go:117] "RemoveContainer" containerID="11f1122cfd69ef2836aa6b430ca28182f2e3c008fc98facfd19b82cc5d6c87e1" Oct 07 17:07:12 crc kubenswrapper[4681]: E1007 17:07:12.858022 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11f1122cfd69ef2836aa6b430ca28182f2e3c008fc98facfd19b82cc5d6c87e1\": container with ID starting with 11f1122cfd69ef2836aa6b430ca28182f2e3c008fc98facfd19b82cc5d6c87e1 not found: ID does not exist" containerID="11f1122cfd69ef2836aa6b430ca28182f2e3c008fc98facfd19b82cc5d6c87e1" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.858040 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11f1122cfd69ef2836aa6b430ca28182f2e3c008fc98facfd19b82cc5d6c87e1"} err="failed to get container status \"11f1122cfd69ef2836aa6b430ca28182f2e3c008fc98facfd19b82cc5d6c87e1\": rpc error: code = NotFound desc = could not find container \"11f1122cfd69ef2836aa6b430ca28182f2e3c008fc98facfd19b82cc5d6c87e1\": container with ID starting with 11f1122cfd69ef2836aa6b430ca28182f2e3c008fc98facfd19b82cc5d6c87e1 not found: ID does not exist" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.858054 4681 scope.go:117] "RemoveContainer" containerID="ee5e41da06f463e01ac03ab2096d20906f8c521730e0f67671d59d886fde2d5d" Oct 07 17:07:12 crc 
kubenswrapper[4681]: I1007 17:07:12.878526 4681 scope.go:117] "RemoveContainer" containerID="ca9565b683133920f4bd8af30476a3682b5f9fb9fe337c0733d5c538bf05c6c1" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.893718 4681 scope.go:117] "RemoveContainer" containerID="46552109765a34afe511792ef8afaa3d18bb3194e5d9ac6ce323f1405b0e46c7" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.906294 4681 scope.go:117] "RemoveContainer" containerID="ee5e41da06f463e01ac03ab2096d20906f8c521730e0f67671d59d886fde2d5d" Oct 07 17:07:12 crc kubenswrapper[4681]: E1007 17:07:12.906780 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee5e41da06f463e01ac03ab2096d20906f8c521730e0f67671d59d886fde2d5d\": container with ID starting with ee5e41da06f463e01ac03ab2096d20906f8c521730e0f67671d59d886fde2d5d not found: ID does not exist" containerID="ee5e41da06f463e01ac03ab2096d20906f8c521730e0f67671d59d886fde2d5d" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.906840 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee5e41da06f463e01ac03ab2096d20906f8c521730e0f67671d59d886fde2d5d"} err="failed to get container status \"ee5e41da06f463e01ac03ab2096d20906f8c521730e0f67671d59d886fde2d5d\": rpc error: code = NotFound desc = could not find container \"ee5e41da06f463e01ac03ab2096d20906f8c521730e0f67671d59d886fde2d5d\": container with ID starting with ee5e41da06f463e01ac03ab2096d20906f8c521730e0f67671d59d886fde2d5d not found: ID does not exist" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.907471 4681 scope.go:117] "RemoveContainer" containerID="ca9565b683133920f4bd8af30476a3682b5f9fb9fe337c0733d5c538bf05c6c1" Oct 07 17:07:12 crc kubenswrapper[4681]: E1007 17:07:12.907956 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca9565b683133920f4bd8af30476a3682b5f9fb9fe337c0733d5c538bf05c6c1\": container with ID starting with ca9565b683133920f4bd8af30476a3682b5f9fb9fe337c0733d5c538bf05c6c1 not found: ID does not exist" containerID="ca9565b683133920f4bd8af30476a3682b5f9fb9fe337c0733d5c538bf05c6c1" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.907986 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca9565b683133920f4bd8af30476a3682b5f9fb9fe337c0733d5c538bf05c6c1"} err="failed to get container status \"ca9565b683133920f4bd8af30476a3682b5f9fb9fe337c0733d5c538bf05c6c1\": rpc error: code = NotFound desc = could not find container \"ca9565b683133920f4bd8af30476a3682b5f9fb9fe337c0733d5c538bf05c6c1\": container with ID starting with ca9565b683133920f4bd8af30476a3682b5f9fb9fe337c0733d5c538bf05c6c1 not found: ID does not exist" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.908005 4681 scope.go:117] "RemoveContainer" containerID="46552109765a34afe511792ef8afaa3d18bb3194e5d9ac6ce323f1405b0e46c7" Oct 07 17:07:12 crc kubenswrapper[4681]: E1007 17:07:12.908378 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46552109765a34afe511792ef8afaa3d18bb3194e5d9ac6ce323f1405b0e46c7\": container with ID starting with 46552109765a34afe511792ef8afaa3d18bb3194e5d9ac6ce323f1405b0e46c7 not found: ID does not exist" containerID="46552109765a34afe511792ef8afaa3d18bb3194e5d9ac6ce323f1405b0e46c7" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.908428 4681 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46552109765a34afe511792ef8afaa3d18bb3194e5d9ac6ce323f1405b0e46c7"} err="failed to get container status \"46552109765a34afe511792ef8afaa3d18bb3194e5d9ac6ce323f1405b0e46c7\": rpc error: code = NotFound desc = could not find container \"46552109765a34afe511792ef8afaa3d18bb3194e5d9ac6ce323f1405b0e46c7\": container with ID starting with 46552109765a34afe511792ef8afaa3d18bb3194e5d9ac6ce323f1405b0e46c7 not found: ID does not exist" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.908454 4681 scope.go:117] "RemoveContainer" containerID="bada8967ed476c5e7fdf78792d0cc48002e43135934a849fbf6789e6c5ef7c45" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.921081 4681 scope.go:117] "RemoveContainer" containerID="bada8967ed476c5e7fdf78792d0cc48002e43135934a849fbf6789e6c5ef7c45" Oct 07 17:07:12 crc kubenswrapper[4681]: E1007 17:07:12.921601 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bada8967ed476c5e7fdf78792d0cc48002e43135934a849fbf6789e6c5ef7c45\": container with ID starting with bada8967ed476c5e7fdf78792d0cc48002e43135934a849fbf6789e6c5ef7c45 not found: ID does not exist" containerID="bada8967ed476c5e7fdf78792d0cc48002e43135934a849fbf6789e6c5ef7c45" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.921628 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bada8967ed476c5e7fdf78792d0cc48002e43135934a849fbf6789e6c5ef7c45"} err="failed to get container status \"bada8967ed476c5e7fdf78792d0cc48002e43135934a849fbf6789e6c5ef7c45\": rpc error: code = NotFound desc = could not find container \"bada8967ed476c5e7fdf78792d0cc48002e43135934a849fbf6789e6c5ef7c45\": container with ID starting with bada8967ed476c5e7fdf78792d0cc48002e43135934a849fbf6789e6c5ef7c45 not found: ID does not exist" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.921647 4681 scope.go:117] "RemoveContainer" containerID="82f446b3e202f296984385071a20f4ecd4ad5cae4b4739b020d223d8000196e4" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.935020 4681 scope.go:117] "RemoveContainer" containerID="c7706e712721246ae55ae5768a4c11db99c8f14804ea649a3c2e04f4d541fa7d" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.953685 4681 scope.go:117] "RemoveContainer" containerID="20cb1d15fe54a02f9ef0be57f0f914459cf0dc514c8f9beb3ed9f86ec7a11e7d" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.967564 4681 scope.go:117] "RemoveContainer" containerID="348072a04a5b2834fbd8a89e53701d4a6f46857770508b7d6f38a69fa17544e5" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.981975 4681 scope.go:117] "RemoveContainer" containerID="7274aa83f539abcd50a2add9ea5c2ae0e4ed5cd1ac950c0e387b4725e39f0e70" Oct 07 17:07:12 crc kubenswrapper[4681]: I1007 17:07:12.999591 4681 scope.go:117] "RemoveContainer" containerID="5253f09a46dfdeea12788e8d60c21085bf23c94bed75eaf1431119e185ac658a" Oct 07 17:07:13 crc kubenswrapper[4681]: I1007 17:07:13.011892 4681 scope.go:117] "RemoveContainer" containerID="348072a04a5b2834fbd8a89e53701d4a6f46857770508b7d6f38a69fa17544e5" Oct 07 17:07:13 crc kubenswrapper[4681]: E1007 17:07:13.012353 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"348072a04a5b2834fbd8a89e53701d4a6f46857770508b7d6f38a69fa17544e5\": container with ID starting with 348072a04a5b2834fbd8a89e53701d4a6f46857770508b7d6f38a69fa17544e5 not found: ID does not exist" 
containerID="348072a04a5b2834fbd8a89e53701d4a6f46857770508b7d6f38a69fa17544e5" Oct 07 17:07:13 crc kubenswrapper[4681]: I1007 17:07:13.012387 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"348072a04a5b2834fbd8a89e53701d4a6f46857770508b7d6f38a69fa17544e5"} err="failed to get container status \"348072a04a5b2834fbd8a89e53701d4a6f46857770508b7d6f38a69fa17544e5\": rpc error: code = NotFound desc = could not find container \"348072a04a5b2834fbd8a89e53701d4a6f46857770508b7d6f38a69fa17544e5\": container with ID starting with 348072a04a5b2834fbd8a89e53701d4a6f46857770508b7d6f38a69fa17544e5 not found: ID does not exist" Oct 07 17:07:13 crc kubenswrapper[4681]: I1007 17:07:13.012411 4681 scope.go:117] "RemoveContainer" containerID="7274aa83f539abcd50a2add9ea5c2ae0e4ed5cd1ac950c0e387b4725e39f0e70" Oct 07 17:07:13 crc kubenswrapper[4681]: E1007 17:07:13.012767 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7274aa83f539abcd50a2add9ea5c2ae0e4ed5cd1ac950c0e387b4725e39f0e70\": container with ID starting with 7274aa83f539abcd50a2add9ea5c2ae0e4ed5cd1ac950c0e387b4725e39f0e70 not found: ID does not exist" containerID="7274aa83f539abcd50a2add9ea5c2ae0e4ed5cd1ac950c0e387b4725e39f0e70" Oct 07 17:07:13 crc kubenswrapper[4681]: I1007 17:07:13.012826 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7274aa83f539abcd50a2add9ea5c2ae0e4ed5cd1ac950c0e387b4725e39f0e70"} err="failed to get container status \"7274aa83f539abcd50a2add9ea5c2ae0e4ed5cd1ac950c0e387b4725e39f0e70\": rpc error: code = NotFound desc = could not find container \"7274aa83f539abcd50a2add9ea5c2ae0e4ed5cd1ac950c0e387b4725e39f0e70\": container with ID starting with 7274aa83f539abcd50a2add9ea5c2ae0e4ed5cd1ac950c0e387b4725e39f0e70 not found: ID does not exist" Oct 07 17:07:13 crc kubenswrapper[4681]: I1007 17:07:13.012877 4681 scope.go:117] "RemoveContainer" containerID="5253f09a46dfdeea12788e8d60c21085bf23c94bed75eaf1431119e185ac658a" Oct 07 17:07:13 crc kubenswrapper[4681]: E1007 17:07:13.013345 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5253f09a46dfdeea12788e8d60c21085bf23c94bed75eaf1431119e185ac658a\": container with ID starting with 5253f09a46dfdeea12788e8d60c21085bf23c94bed75eaf1431119e185ac658a not found: ID does not exist" containerID="5253f09a46dfdeea12788e8d60c21085bf23c94bed75eaf1431119e185ac658a" Oct 07 17:07:13 crc kubenswrapper[4681]: I1007 17:07:13.013373 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5253f09a46dfdeea12788e8d60c21085bf23c94bed75eaf1431119e185ac658a"} err="failed to get container status \"5253f09a46dfdeea12788e8d60c21085bf23c94bed75eaf1431119e185ac658a\": rpc error: code = NotFound desc = could not find container \"5253f09a46dfdeea12788e8d60c21085bf23c94bed75eaf1431119e185ac658a\": container with ID starting with 5253f09a46dfdeea12788e8d60c21085bf23c94bed75eaf1431119e185ac658a not found: ID does not exist" Oct 07 17:07:13 crc kubenswrapper[4681]: I1007 17:07:13.035120 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29ae017e-1bbd-4cf3-bda3-5fd9a25866c7" path="/var/lib/kubelet/pods/29ae017e-1bbd-4cf3-bda3-5fd9a25866c7/volumes" Oct 07 17:07:13 crc kubenswrapper[4681]: I1007 17:07:13.035857 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="390445e9-214f-423d-b39d-9411ca5cf099" path="/var/lib/kubelet/pods/390445e9-214f-423d-b39d-9411ca5cf099/volumes" Oct 07 17:07:13 crc kubenswrapper[4681]: I1007 17:07:13.036528 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61783842-70fb-40b1-bc57-f614ca527168" path="/var/lib/kubelet/pods/61783842-70fb-40b1-bc57-f614ca527168/volumes" Oct 07 17:07:13 crc kubenswrapper[4681]: I1007 17:07:13.037488 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbfe57d1-0360-4f50-b36c-cc80a36f868e" path="/var/lib/kubelet/pods/dbfe57d1-0360-4f50-b36c-cc80a36f868e/volumes" Oct 07 17:07:13 crc kubenswrapper[4681]: I1007 17:07:13.038113 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eab9b71a-d59d-436b-8c0b-c62801ea9326" path="/var/lib/kubelet/pods/eab9b71a-d59d-436b-8c0b-c62801ea9326/volumes" Oct 07 17:07:13 crc kubenswrapper[4681]: I1007 17:07:13.039124 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eda3a2b0-8a17-40b1-b463-7b98159360db" path="/var/lib/kubelet/pods/eda3a2b0-8a17-40b1-b463-7b98159360db/volumes" Oct 07 17:07:13 crc kubenswrapper[4681]: I1007 17:07:13.713098 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-kbm6c" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.002587 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vwnmb"] Oct 07 17:07:14 crc kubenswrapper[4681]: E1007 17:07:14.002774 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eda3a2b0-8a17-40b1-b463-7b98159360db" containerName="registry-server" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.002785 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="eda3a2b0-8a17-40b1-b463-7b98159360db" containerName="registry-server" Oct 07 17:07:14 crc kubenswrapper[4681]: E1007 17:07:14.002795 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29ae017e-1bbd-4cf3-bda3-5fd9a25866c7" containerName="extract-utilities" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.002800 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="29ae017e-1bbd-4cf3-bda3-5fd9a25866c7" containerName="extract-utilities" Oct 07 17:07:14 crc kubenswrapper[4681]: E1007 17:07:14.002811 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eab9b71a-d59d-436b-8c0b-c62801ea9326" containerName="extract-content" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.002816 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="eab9b71a-d59d-436b-8c0b-c62801ea9326" containerName="extract-content" Oct 07 17:07:14 crc kubenswrapper[4681]: E1007 17:07:14.002826 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eab9b71a-d59d-436b-8c0b-c62801ea9326" containerName="extract-utilities" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.002832 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="eab9b71a-d59d-436b-8c0b-c62801ea9326" containerName="extract-utilities" Oct 07 17:07:14 crc kubenswrapper[4681]: E1007 17:07:14.002840 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="390445e9-214f-423d-b39d-9411ca5cf099" containerName="registry-server" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.002850 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="390445e9-214f-423d-b39d-9411ca5cf099" containerName="registry-server" Oct 07 17:07:14 crc kubenswrapper[4681]: E1007 17:07:14.002858 4681 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbfe57d1-0360-4f50-b36c-cc80a36f868e" containerName="extract-content" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.002864 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbfe57d1-0360-4f50-b36c-cc80a36f868e" containerName="extract-content" Oct 07 17:07:14 crc kubenswrapper[4681]: E1007 17:07:14.002880 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbfe57d1-0360-4f50-b36c-cc80a36f868e" containerName="extract-utilities" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.002890 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbfe57d1-0360-4f50-b36c-cc80a36f868e" containerName="extract-utilities" Oct 07 17:07:14 crc kubenswrapper[4681]: E1007 17:07:14.002922 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eda3a2b0-8a17-40b1-b463-7b98159360db" containerName="extract-content" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.002931 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="eda3a2b0-8a17-40b1-b463-7b98159360db" containerName="extract-content" Oct 07 17:07:14 crc kubenswrapper[4681]: E1007 17:07:14.002940 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eab9b71a-d59d-436b-8c0b-c62801ea9326" containerName="registry-server" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.002946 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="eab9b71a-d59d-436b-8c0b-c62801ea9326" containerName="registry-server" Oct 07 17:07:14 crc kubenswrapper[4681]: E1007 17:07:14.002956 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29ae017e-1bbd-4cf3-bda3-5fd9a25866c7" containerName="extract-content" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.002962 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="29ae017e-1bbd-4cf3-bda3-5fd9a25866c7" containerName="extract-content" Oct 07 17:07:14 crc kubenswrapper[4681]: E1007 17:07:14.002972 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29ae017e-1bbd-4cf3-bda3-5fd9a25866c7" containerName="registry-server" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.002978 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="29ae017e-1bbd-4cf3-bda3-5fd9a25866c7" containerName="registry-server" Oct 07 17:07:14 crc kubenswrapper[4681]: E1007 17:07:14.002987 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="390445e9-214f-423d-b39d-9411ca5cf099" containerName="extract-content" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.002994 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="390445e9-214f-423d-b39d-9411ca5cf099" containerName="extract-content" Oct 07 17:07:14 crc kubenswrapper[4681]: E1007 17:07:14.003003 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="390445e9-214f-423d-b39d-9411ca5cf099" containerName="extract-utilities" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.003009 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="390445e9-214f-423d-b39d-9411ca5cf099" containerName="extract-utilities" Oct 07 17:07:14 crc kubenswrapper[4681]: E1007 17:07:14.003019 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbfe57d1-0360-4f50-b36c-cc80a36f868e" containerName="registry-server" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.003024 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbfe57d1-0360-4f50-b36c-cc80a36f868e" containerName="registry-server" Oct 07 17:07:14 crc kubenswrapper[4681]: 
E1007 17:07:14.003031 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61783842-70fb-40b1-bc57-f614ca527168" containerName="marketplace-operator" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.003036 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="61783842-70fb-40b1-bc57-f614ca527168" containerName="marketplace-operator" Oct 07 17:07:14 crc kubenswrapper[4681]: E1007 17:07:14.003043 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eda3a2b0-8a17-40b1-b463-7b98159360db" containerName="extract-utilities" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.003049 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="eda3a2b0-8a17-40b1-b463-7b98159360db" containerName="extract-utilities" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.003137 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="61783842-70fb-40b1-bc57-f614ca527168" containerName="marketplace-operator" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.003147 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbfe57d1-0360-4f50-b36c-cc80a36f868e" containerName="registry-server" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.003157 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="29ae017e-1bbd-4cf3-bda3-5fd9a25866c7" containerName="registry-server" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.003168 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="eab9b71a-d59d-436b-8c0b-c62801ea9326" containerName="registry-server" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.003173 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="390445e9-214f-423d-b39d-9411ca5cf099" containerName="registry-server" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.003180 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="eda3a2b0-8a17-40b1-b463-7b98159360db" containerName="registry-server" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.003818 4681 util.go:30] "No sandbox for pod can be found. 
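
The paired cpu_manager / state_mem and memory_manager entries above show kubelet purging per-container resource-manager state for the six just-deleted marketplace pods while admitting redhat-operators-vwnmb. That state lives in a JSON checkpoint on the node; a minimal sketch that decodes it, assuming the conventional /var/lib/kubelet/cpu_manager_state path and the documented field names (the schema below is an assumption, not taken from this node):

package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// cpuManagerState approximates the checkpoint kubelet writes; with the
// static policy, Entries maps podUID -> container name -> pinned cpuset.
type cpuManagerState struct {
	PolicyName    string                       `json:"policyName"`
	DefaultCPUSet string                       `json:"defaultCpuSet"`
	Entries       map[string]map[string]string `json:"entries"`
}

func main() {
	raw, err := os.ReadFile("/var/lib/kubelet/cpu_manager_state")
	if err != nil {
		fmt.Println("no checkpoint:", err)
		return
	}
	var st cpuManagerState
	if err := json.Unmarshal(raw, &st); err != nil {
		fmt.Println("decode:", err)
		return
	}
	// Stale podUIDs (like the six purged above) disappear from Entries.
	fmt.Println(st.PolicyName, len(st.Entries), "pods with pinned CPUs")
}
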
Need to start a new one" pod="openshift-marketplace/redhat-operators-vwnmb" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.005709 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.013726 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vwnmb"] Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.176501 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85386158-eea6-47d2-bd74-d43e0058715f-utilities\") pod \"redhat-operators-vwnmb\" (UID: \"85386158-eea6-47d2-bd74-d43e0058715f\") " pod="openshift-marketplace/redhat-operators-vwnmb" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.176584 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9t4k\" (UniqueName: \"kubernetes.io/projected/85386158-eea6-47d2-bd74-d43e0058715f-kube-api-access-r9t4k\") pod \"redhat-operators-vwnmb\" (UID: \"85386158-eea6-47d2-bd74-d43e0058715f\") " pod="openshift-marketplace/redhat-operators-vwnmb" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.176690 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85386158-eea6-47d2-bd74-d43e0058715f-catalog-content\") pod \"redhat-operators-vwnmb\" (UID: \"85386158-eea6-47d2-bd74-d43e0058715f\") " pod="openshift-marketplace/redhat-operators-vwnmb" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.277705 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85386158-eea6-47d2-bd74-d43e0058715f-catalog-content\") pod \"redhat-operators-vwnmb\" (UID: \"85386158-eea6-47d2-bd74-d43e0058715f\") " pod="openshift-marketplace/redhat-operators-vwnmb" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.277750 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85386158-eea6-47d2-bd74-d43e0058715f-utilities\") pod \"redhat-operators-vwnmb\" (UID: \"85386158-eea6-47d2-bd74-d43e0058715f\") " pod="openshift-marketplace/redhat-operators-vwnmb" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.277783 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9t4k\" (UniqueName: \"kubernetes.io/projected/85386158-eea6-47d2-bd74-d43e0058715f-kube-api-access-r9t4k\") pod \"redhat-operators-vwnmb\" (UID: \"85386158-eea6-47d2-bd74-d43e0058715f\") " pod="openshift-marketplace/redhat-operators-vwnmb" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.278219 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85386158-eea6-47d2-bd74-d43e0058715f-catalog-content\") pod \"redhat-operators-vwnmb\" (UID: \"85386158-eea6-47d2-bd74-d43e0058715f\") " pod="openshift-marketplace/redhat-operators-vwnmb" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.278301 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85386158-eea6-47d2-bd74-d43e0058715f-utilities\") pod \"redhat-operators-vwnmb\" (UID: \"85386158-eea6-47d2-bd74-d43e0058715f\") " 
pod="openshift-marketplace/redhat-operators-vwnmb" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.293159 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9t4k\" (UniqueName: \"kubernetes.io/projected/85386158-eea6-47d2-bd74-d43e0058715f-kube-api-access-r9t4k\") pod \"redhat-operators-vwnmb\" (UID: \"85386158-eea6-47d2-bd74-d43e0058715f\") " pod="openshift-marketplace/redhat-operators-vwnmb" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.320950 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vwnmb" Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.696336 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vwnmb"] Oct 07 17:07:14 crc kubenswrapper[4681]: I1007 17:07:14.718462 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwnmb" event={"ID":"85386158-eea6-47d2-bd74-d43e0058715f","Type":"ContainerStarted","Data":"bd6d6b8eb1ef3916bfcc55470e09f3f36259f8e6fce075af42ab516e6b577f69"} Oct 07 17:07:15 crc kubenswrapper[4681]: I1007 17:07:15.008906 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jvq9k"] Oct 07 17:07:15 crc kubenswrapper[4681]: I1007 17:07:15.010247 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jvq9k" Oct 07 17:07:15 crc kubenswrapper[4681]: I1007 17:07:15.014317 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 07 17:07:15 crc kubenswrapper[4681]: I1007 17:07:15.015278 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvq9k"] Oct 07 17:07:15 crc kubenswrapper[4681]: I1007 17:07:15.200217 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb7e45fe-c863-485b-a67b-133a94f0a533-catalog-content\") pod \"redhat-marketplace-jvq9k\" (UID: \"fb7e45fe-c863-485b-a67b-133a94f0a533\") " pod="openshift-marketplace/redhat-marketplace-jvq9k" Oct 07 17:07:15 crc kubenswrapper[4681]: I1007 17:07:15.200305 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb7e45fe-c863-485b-a67b-133a94f0a533-utilities\") pod \"redhat-marketplace-jvq9k\" (UID: \"fb7e45fe-c863-485b-a67b-133a94f0a533\") " pod="openshift-marketplace/redhat-marketplace-jvq9k" Oct 07 17:07:15 crc kubenswrapper[4681]: I1007 17:07:15.200437 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k422j\" (UniqueName: \"kubernetes.io/projected/fb7e45fe-c863-485b-a67b-133a94f0a533-kube-api-access-k422j\") pod \"redhat-marketplace-jvq9k\" (UID: \"fb7e45fe-c863-485b-a67b-133a94f0a533\") " pod="openshift-marketplace/redhat-marketplace-jvq9k" Oct 07 17:07:15 crc kubenswrapper[4681]: I1007 17:07:15.301809 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb7e45fe-c863-485b-a67b-133a94f0a533-utilities\") pod \"redhat-marketplace-jvq9k\" (UID: \"fb7e45fe-c863-485b-a67b-133a94f0a533\") " pod="openshift-marketplace/redhat-marketplace-jvq9k" Oct 07 17:07:15 crc kubenswrapper[4681]: I1007 17:07:15.301925 4681 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k422j\" (UniqueName: \"kubernetes.io/projected/fb7e45fe-c863-485b-a67b-133a94f0a533-kube-api-access-k422j\") pod \"redhat-marketplace-jvq9k\" (UID: \"fb7e45fe-c863-485b-a67b-133a94f0a533\") " pod="openshift-marketplace/redhat-marketplace-jvq9k" Oct 07 17:07:15 crc kubenswrapper[4681]: I1007 17:07:15.301974 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb7e45fe-c863-485b-a67b-133a94f0a533-catalog-content\") pod \"redhat-marketplace-jvq9k\" (UID: \"fb7e45fe-c863-485b-a67b-133a94f0a533\") " pod="openshift-marketplace/redhat-marketplace-jvq9k" Oct 07 17:07:15 crc kubenswrapper[4681]: I1007 17:07:15.302341 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb7e45fe-c863-485b-a67b-133a94f0a533-utilities\") pod \"redhat-marketplace-jvq9k\" (UID: \"fb7e45fe-c863-485b-a67b-133a94f0a533\") " pod="openshift-marketplace/redhat-marketplace-jvq9k" Oct 07 17:07:15 crc kubenswrapper[4681]: I1007 17:07:15.302533 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb7e45fe-c863-485b-a67b-133a94f0a533-catalog-content\") pod \"redhat-marketplace-jvq9k\" (UID: \"fb7e45fe-c863-485b-a67b-133a94f0a533\") " pod="openshift-marketplace/redhat-marketplace-jvq9k" Oct 07 17:07:15 crc kubenswrapper[4681]: I1007 17:07:15.322057 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k422j\" (UniqueName: \"kubernetes.io/projected/fb7e45fe-c863-485b-a67b-133a94f0a533-kube-api-access-k422j\") pod \"redhat-marketplace-jvq9k\" (UID: \"fb7e45fe-c863-485b-a67b-133a94f0a533\") " pod="openshift-marketplace/redhat-marketplace-jvq9k" Oct 07 17:07:15 crc kubenswrapper[4681]: I1007 17:07:15.329726 4681 util.go:30] "No sandbox for pod can be found. 
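
Each catalog pod above walks the same volume sequence: VerifyControllerAttachedVolume, then MountVolume.SetUp for two emptyDir volumes (utilities, catalog-content) and one projected service-account token volume (kube-api-access-*); the emptyDirs succeed immediately because they are just node-local directories. A minimal sketch of the volume stanza that would produce these entries, in k8s.io/api types; the names match the log, everything else is assumed:

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	// utilities and catalog-content as seen in the SetUp entries above.
	// The kube-api-access-* projected volume is injected automatically
	// for the pod's service account, so it is not declared here.
	vols := []corev1.Volume{
		{Name: "utilities", VolumeSource: corev1.VolumeSource{
			EmptyDir: &corev1.EmptyDirVolumeSource{}}},
		{Name: "catalog-content", VolumeSource: corev1.VolumeSource{
			EmptyDir: &corev1.EmptyDirVolumeSource{}}},
	}
	for _, v := range vols {
		fmt.Println(v.Name)
	}
}
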
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jvq9k" Oct 07 17:07:15 crc kubenswrapper[4681]: I1007 17:07:15.515021 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvq9k"] Oct 07 17:07:15 crc kubenswrapper[4681]: W1007 17:07:15.519180 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb7e45fe_c863_485b_a67b_133a94f0a533.slice/crio-6494d3481daf81c5628f2e0c78313b8301f878e9f7cdd3e4b7b9d88412638b95 WatchSource:0}: Error finding container 6494d3481daf81c5628f2e0c78313b8301f878e9f7cdd3e4b7b9d88412638b95: Status 404 returned error can't find the container with id 6494d3481daf81c5628f2e0c78313b8301f878e9f7cdd3e4b7b9d88412638b95 Oct 07 17:07:15 crc kubenswrapper[4681]: I1007 17:07:15.725526 4681 generic.go:334] "Generic (PLEG): container finished" podID="85386158-eea6-47d2-bd74-d43e0058715f" containerID="bc189d7f1f0fea5b840abb631dde86d33ca57a89d080e18db348b76100f174fe" exitCode=0 Oct 07 17:07:15 crc kubenswrapper[4681]: I1007 17:07:15.725727 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwnmb" event={"ID":"85386158-eea6-47d2-bd74-d43e0058715f","Type":"ContainerDied","Data":"bc189d7f1f0fea5b840abb631dde86d33ca57a89d080e18db348b76100f174fe"} Oct 07 17:07:15 crc kubenswrapper[4681]: I1007 17:07:15.728500 4681 generic.go:334] "Generic (PLEG): container finished" podID="fb7e45fe-c863-485b-a67b-133a94f0a533" containerID="09105418c0fc7f456cb17a779bd9e54a9c9eac1ea922f50fe377b1734f61f227" exitCode=0 Oct 07 17:07:15 crc kubenswrapper[4681]: I1007 17:07:15.728529 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvq9k" event={"ID":"fb7e45fe-c863-485b-a67b-133a94f0a533","Type":"ContainerDied","Data":"09105418c0fc7f456cb17a779bd9e54a9c9eac1ea922f50fe377b1734f61f227"} Oct 07 17:07:15 crc kubenswrapper[4681]: I1007 17:07:15.728548 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvq9k" event={"ID":"fb7e45fe-c863-485b-a67b-133a94f0a533","Type":"ContainerStarted","Data":"6494d3481daf81c5628f2e0c78313b8301f878e9f7cdd3e4b7b9d88412638b95"} Oct 07 17:07:16 crc kubenswrapper[4681]: I1007 17:07:16.411029 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ntpwx"] Oct 07 17:07:16 crc kubenswrapper[4681]: I1007 17:07:16.413228 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ntpwx" Oct 07 17:07:16 crc kubenswrapper[4681]: I1007 17:07:16.415845 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 07 17:07:16 crc kubenswrapper[4681]: I1007 17:07:16.416561 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee-utilities\") pod \"community-operators-ntpwx\" (UID: \"b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee\") " pod="openshift-marketplace/community-operators-ntpwx" Oct 07 17:07:16 crc kubenswrapper[4681]: I1007 17:07:16.416608 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbfpb\" (UniqueName: \"kubernetes.io/projected/b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee-kube-api-access-zbfpb\") pod \"community-operators-ntpwx\" (UID: \"b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee\") " pod="openshift-marketplace/community-operators-ntpwx" Oct 07 17:07:16 crc kubenswrapper[4681]: I1007 17:07:16.416671 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee-catalog-content\") pod \"community-operators-ntpwx\" (UID: \"b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee\") " pod="openshift-marketplace/community-operators-ntpwx" Oct 07 17:07:16 crc kubenswrapper[4681]: I1007 17:07:16.422844 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ntpwx"] Oct 07 17:07:16 crc kubenswrapper[4681]: I1007 17:07:16.517655 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee-catalog-content\") pod \"community-operators-ntpwx\" (UID: \"b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee\") " pod="openshift-marketplace/community-operators-ntpwx" Oct 07 17:07:16 crc kubenswrapper[4681]: I1007 17:07:16.517716 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee-utilities\") pod \"community-operators-ntpwx\" (UID: \"b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee\") " pod="openshift-marketplace/community-operators-ntpwx" Oct 07 17:07:16 crc kubenswrapper[4681]: I1007 17:07:16.517795 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbfpb\" (UniqueName: \"kubernetes.io/projected/b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee-kube-api-access-zbfpb\") pod \"community-operators-ntpwx\" (UID: \"b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee\") " pod="openshift-marketplace/community-operators-ntpwx" Oct 07 17:07:16 crc kubenswrapper[4681]: I1007 17:07:16.518396 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee-utilities\") pod \"community-operators-ntpwx\" (UID: \"b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee\") " pod="openshift-marketplace/community-operators-ntpwx" Oct 07 17:07:16 crc kubenswrapper[4681]: I1007 17:07:16.518555 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee-catalog-content\") pod \"community-operators-ntpwx\" (UID: 
\"b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee\") " pod="openshift-marketplace/community-operators-ntpwx" Oct 07 17:07:16 crc kubenswrapper[4681]: I1007 17:07:16.538237 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbfpb\" (UniqueName: \"kubernetes.io/projected/b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee-kube-api-access-zbfpb\") pod \"community-operators-ntpwx\" (UID: \"b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee\") " pod="openshift-marketplace/community-operators-ntpwx" Oct 07 17:07:16 crc kubenswrapper[4681]: I1007 17:07:16.735705 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwnmb" event={"ID":"85386158-eea6-47d2-bd74-d43e0058715f","Type":"ContainerStarted","Data":"131b139777266861bc6ddc7e1ddb639f8b9bd81d0de708cedc933b014baa03b4"} Oct 07 17:07:16 crc kubenswrapper[4681]: I1007 17:07:16.737548 4681 generic.go:334] "Generic (PLEG): container finished" podID="fb7e45fe-c863-485b-a67b-133a94f0a533" containerID="3bd1407a094b59a861c1762c9f60bd6038a7bb2230bb3775b73e5a517b061de1" exitCode=0 Oct 07 17:07:16 crc kubenswrapper[4681]: I1007 17:07:16.737574 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvq9k" event={"ID":"fb7e45fe-c863-485b-a67b-133a94f0a533","Type":"ContainerDied","Data":"3bd1407a094b59a861c1762c9f60bd6038a7bb2230bb3775b73e5a517b061de1"} Oct 07 17:07:16 crc kubenswrapper[4681]: I1007 17:07:16.782963 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ntpwx" Oct 07 17:07:17 crc kubenswrapper[4681]: I1007 17:07:17.189507 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ntpwx"] Oct 07 17:07:17 crc kubenswrapper[4681]: W1007 17:07:17.195341 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb81cacc9_a147_4bfe_887d_b7bfbd0ca3ee.slice/crio-2abaaebf40ee797c2217853fc76e5fd9e9e7973eb66693cebed12851be22810e WatchSource:0}: Error finding container 2abaaebf40ee797c2217853fc76e5fd9e9e7973eb66693cebed12851be22810e: Status 404 returned error can't find the container with id 2abaaebf40ee797c2217853fc76e5fd9e9e7973eb66693cebed12851be22810e Oct 07 17:07:17 crc kubenswrapper[4681]: I1007 17:07:17.404188 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c95wr"] Oct 07 17:07:17 crc kubenswrapper[4681]: I1007 17:07:17.405403 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c95wr" Oct 07 17:07:17 crc kubenswrapper[4681]: I1007 17:07:17.409593 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 07 17:07:17 crc kubenswrapper[4681]: I1007 17:07:17.421536 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c95wr"] Oct 07 17:07:17 crc kubenswrapper[4681]: I1007 17:07:17.535661 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2743c88-7c95-463b-b5d3-4d183dd1e3e1-catalog-content\") pod \"certified-operators-c95wr\" (UID: \"c2743c88-7c95-463b-b5d3-4d183dd1e3e1\") " pod="openshift-marketplace/certified-operators-c95wr" Oct 07 17:07:17 crc kubenswrapper[4681]: I1007 17:07:17.535746 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2743c88-7c95-463b-b5d3-4d183dd1e3e1-utilities\") pod \"certified-operators-c95wr\" (UID: \"c2743c88-7c95-463b-b5d3-4d183dd1e3e1\") " pod="openshift-marketplace/certified-operators-c95wr" Oct 07 17:07:17 crc kubenswrapper[4681]: I1007 17:07:17.535929 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz54x\" (UniqueName: \"kubernetes.io/projected/c2743c88-7c95-463b-b5d3-4d183dd1e3e1-kube-api-access-qz54x\") pod \"certified-operators-c95wr\" (UID: \"c2743c88-7c95-463b-b5d3-4d183dd1e3e1\") " pod="openshift-marketplace/certified-operators-c95wr" Oct 07 17:07:17 crc kubenswrapper[4681]: I1007 17:07:17.636482 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz54x\" (UniqueName: \"kubernetes.io/projected/c2743c88-7c95-463b-b5d3-4d183dd1e3e1-kube-api-access-qz54x\") pod \"certified-operators-c95wr\" (UID: \"c2743c88-7c95-463b-b5d3-4d183dd1e3e1\") " pod="openshift-marketplace/certified-operators-c95wr" Oct 07 17:07:17 crc kubenswrapper[4681]: I1007 17:07:17.636528 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2743c88-7c95-463b-b5d3-4d183dd1e3e1-catalog-content\") pod \"certified-operators-c95wr\" (UID: \"c2743c88-7c95-463b-b5d3-4d183dd1e3e1\") " pod="openshift-marketplace/certified-operators-c95wr" Oct 07 17:07:17 crc kubenswrapper[4681]: I1007 17:07:17.636561 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2743c88-7c95-463b-b5d3-4d183dd1e3e1-utilities\") pod \"certified-operators-c95wr\" (UID: \"c2743c88-7c95-463b-b5d3-4d183dd1e3e1\") " pod="openshift-marketplace/certified-operators-c95wr" Oct 07 17:07:17 crc kubenswrapper[4681]: I1007 17:07:17.637026 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2743c88-7c95-463b-b5d3-4d183dd1e3e1-utilities\") pod \"certified-operators-c95wr\" (UID: \"c2743c88-7c95-463b-b5d3-4d183dd1e3e1\") " pod="openshift-marketplace/certified-operators-c95wr" Oct 07 17:07:17 crc kubenswrapper[4681]: I1007 17:07:17.637054 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2743c88-7c95-463b-b5d3-4d183dd1e3e1-catalog-content\") pod \"certified-operators-c95wr\" (UID: 
\"c2743c88-7c95-463b-b5d3-4d183dd1e3e1\") " pod="openshift-marketplace/certified-operators-c95wr" Oct 07 17:07:17 crc kubenswrapper[4681]: I1007 17:07:17.659570 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz54x\" (UniqueName: \"kubernetes.io/projected/c2743c88-7c95-463b-b5d3-4d183dd1e3e1-kube-api-access-qz54x\") pod \"certified-operators-c95wr\" (UID: \"c2743c88-7c95-463b-b5d3-4d183dd1e3e1\") " pod="openshift-marketplace/certified-operators-c95wr" Oct 07 17:07:17 crc kubenswrapper[4681]: I1007 17:07:17.723365 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c95wr" Oct 07 17:07:17 crc kubenswrapper[4681]: I1007 17:07:17.752923 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvq9k" event={"ID":"fb7e45fe-c863-485b-a67b-133a94f0a533","Type":"ContainerStarted","Data":"e65977f21793b3c1c88bed7043c0c502934d3752ddbbf91ad0080b8053d8d124"} Oct 07 17:07:17 crc kubenswrapper[4681]: I1007 17:07:17.755514 4681 generic.go:334] "Generic (PLEG): container finished" podID="b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee" containerID="ba57479f426d2401432658b33ec82f2014402d289d9115119a4ad96362d347d1" exitCode=0 Oct 07 17:07:17 crc kubenswrapper[4681]: I1007 17:07:17.755613 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntpwx" event={"ID":"b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee","Type":"ContainerDied","Data":"ba57479f426d2401432658b33ec82f2014402d289d9115119a4ad96362d347d1"} Oct 07 17:07:17 crc kubenswrapper[4681]: I1007 17:07:17.755664 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntpwx" event={"ID":"b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee","Type":"ContainerStarted","Data":"2abaaebf40ee797c2217853fc76e5fd9e9e7973eb66693cebed12851be22810e"} Oct 07 17:07:17 crc kubenswrapper[4681]: I1007 17:07:17.766251 4681 generic.go:334] "Generic (PLEG): container finished" podID="85386158-eea6-47d2-bd74-d43e0058715f" containerID="131b139777266861bc6ddc7e1ddb639f8b9bd81d0de708cedc933b014baa03b4" exitCode=0 Oct 07 17:07:17 crc kubenswrapper[4681]: I1007 17:07:17.766319 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwnmb" event={"ID":"85386158-eea6-47d2-bd74-d43e0058715f","Type":"ContainerDied","Data":"131b139777266861bc6ddc7e1ddb639f8b9bd81d0de708cedc933b014baa03b4"} Oct 07 17:07:17 crc kubenswrapper[4681]: I1007 17:07:17.774327 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jvq9k" podStartSLOduration=2.276291732 podStartE2EDuration="3.774308327s" podCreationTimestamp="2025-10-07 17:07:14 +0000 UTC" firstStartedPulling="2025-10-07 17:07:15.729641676 +0000 UTC m=+239.377053231" lastFinishedPulling="2025-10-07 17:07:17.227658271 +0000 UTC m=+240.875069826" observedRunningTime="2025-10-07 17:07:17.771495246 +0000 UTC m=+241.418906801" watchObservedRunningTime="2025-10-07 17:07:17.774308327 +0000 UTC m=+241.421719892" Oct 07 17:07:18 crc kubenswrapper[4681]: I1007 17:07:18.167929 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c95wr"] Oct 07 17:07:18 crc kubenswrapper[4681]: W1007 17:07:18.179407 4681 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2743c88_7c95_463b_b5d3_4d183dd1e3e1.slice/crio-8c9b6ff65d60cad1554e754a14a051a34e30a530ac447551bcd0f178c0d291be WatchSource:0}: Error finding container 8c9b6ff65d60cad1554e754a14a051a34e30a530ac447551bcd0f178c0d291be: Status 404 returned error can't find the container with id 8c9b6ff65d60cad1554e754a14a051a34e30a530ac447551bcd0f178c0d291be Oct 07 17:07:18 crc kubenswrapper[4681]: I1007 17:07:18.772514 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwnmb" event={"ID":"85386158-eea6-47d2-bd74-d43e0058715f","Type":"ContainerStarted","Data":"d2b30eb544c2f07955373dc86a1233ab05ac7b2025a12d3a0758be6d10f9d8ea"} Oct 07 17:07:18 crc kubenswrapper[4681]: I1007 17:07:18.780426 4681 generic.go:334] "Generic (PLEG): container finished" podID="c2743c88-7c95-463b-b5d3-4d183dd1e3e1" containerID="5cc7db48631c86adb6952fffd18a7b27714bd15f34c8fe5b05a700c56e4e4ac3" exitCode=0 Oct 07 17:07:18 crc kubenswrapper[4681]: I1007 17:07:18.781501 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c95wr" event={"ID":"c2743c88-7c95-463b-b5d3-4d183dd1e3e1","Type":"ContainerDied","Data":"5cc7db48631c86adb6952fffd18a7b27714bd15f34c8fe5b05a700c56e4e4ac3"} Oct 07 17:07:18 crc kubenswrapper[4681]: I1007 17:07:18.781525 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c95wr" event={"ID":"c2743c88-7c95-463b-b5d3-4d183dd1e3e1","Type":"ContainerStarted","Data":"8c9b6ff65d60cad1554e754a14a051a34e30a530ac447551bcd0f178c0d291be"} Oct 07 17:07:18 crc kubenswrapper[4681]: I1007 17:07:18.790432 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vwnmb" podStartSLOduration=3.345656644 podStartE2EDuration="5.790413227s" podCreationTimestamp="2025-10-07 17:07:13 +0000 UTC" firstStartedPulling="2025-10-07 17:07:15.727009699 +0000 UTC m=+239.374421254" lastFinishedPulling="2025-10-07 17:07:18.171766282 +0000 UTC m=+241.819177837" observedRunningTime="2025-10-07 17:07:18.789793279 +0000 UTC m=+242.437204834" watchObservedRunningTime="2025-10-07 17:07:18.790413227 +0000 UTC m=+242.437824782" Oct 07 17:07:19 crc kubenswrapper[4681]: I1007 17:07:19.787384 4681 generic.go:334] "Generic (PLEG): container finished" podID="b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee" containerID="5151d816f25203f6130fcd51be3364b54fcc4977009207012b673704cb78332c" exitCode=0 Oct 07 17:07:19 crc kubenswrapper[4681]: I1007 17:07:19.789446 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntpwx" event={"ID":"b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee","Type":"ContainerDied","Data":"5151d816f25203f6130fcd51be3364b54fcc4977009207012b673704cb78332c"} Oct 07 17:07:20 crc kubenswrapper[4681]: I1007 17:07:20.796056 4681 generic.go:334] "Generic (PLEG): container finished" podID="c2743c88-7c95-463b-b5d3-4d183dd1e3e1" containerID="9a852c6be82ad89fbe89d5053c4a0e4e5f9b311041d15f2ff56f260564104425" exitCode=0 Oct 07 17:07:20 crc kubenswrapper[4681]: I1007 17:07:20.796132 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c95wr" event={"ID":"c2743c88-7c95-463b-b5d3-4d183dd1e3e1","Type":"ContainerDied","Data":"9a852c6be82ad89fbe89d5053c4a0e4e5f9b311041d15f2ff56f260564104425"} Oct 07 17:07:20 crc kubenswrapper[4681]: I1007 17:07:20.822506 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-ntpwx" event={"ID":"b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee","Type":"ContainerStarted","Data":"15641db3507e111f51442548366e1ea3dec8bbe302594ef02f507f35eec386a1"} Oct 07 17:07:20 crc kubenswrapper[4681]: I1007 17:07:20.849505 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ntpwx" podStartSLOduration=2.410344797 podStartE2EDuration="4.849486767s" podCreationTimestamp="2025-10-07 17:07:16 +0000 UTC" firstStartedPulling="2025-10-07 17:07:17.758077796 +0000 UTC m=+241.405489351" lastFinishedPulling="2025-10-07 17:07:20.197219766 +0000 UTC m=+243.844631321" observedRunningTime="2025-10-07 17:07:20.843671878 +0000 UTC m=+244.491083423" watchObservedRunningTime="2025-10-07 17:07:20.849486767 +0000 UTC m=+244.496898322" Oct 07 17:07:21 crc kubenswrapper[4681]: I1007 17:07:21.831174 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c95wr" event={"ID":"c2743c88-7c95-463b-b5d3-4d183dd1e3e1","Type":"ContainerStarted","Data":"5d91b44754e132fd1f445d23de7b6f8f7a5e5f3f813a1fbefe55190e307b1c77"} Oct 07 17:07:21 crc kubenswrapper[4681]: I1007 17:07:21.848396 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c95wr" podStartSLOduration=2.435450517 podStartE2EDuration="4.848376097s" podCreationTimestamp="2025-10-07 17:07:17 +0000 UTC" firstStartedPulling="2025-10-07 17:07:18.781967652 +0000 UTC m=+242.429379207" lastFinishedPulling="2025-10-07 17:07:21.194893232 +0000 UTC m=+244.842304787" observedRunningTime="2025-10-07 17:07:21.846948765 +0000 UTC m=+245.494360330" watchObservedRunningTime="2025-10-07 17:07:21.848376097 +0000 UTC m=+245.495787652" Oct 07 17:07:24 crc kubenswrapper[4681]: I1007 17:07:24.007082 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-ltnkf" Oct 07 17:07:24 crc kubenswrapper[4681]: I1007 17:07:24.084408 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vr5kp"] Oct 07 17:07:24 crc kubenswrapper[4681]: I1007 17:07:24.321524 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vwnmb" Oct 07 17:07:24 crc kubenswrapper[4681]: I1007 17:07:24.321738 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vwnmb" Oct 07 17:07:24 crc kubenswrapper[4681]: I1007 17:07:24.371823 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vwnmb" Oct 07 17:07:24 crc kubenswrapper[4681]: I1007 17:07:24.886429 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vwnmb" Oct 07 17:07:25 crc kubenswrapper[4681]: I1007 17:07:25.330189 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jvq9k" Oct 07 17:07:25 crc kubenswrapper[4681]: I1007 17:07:25.330689 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jvq9k" Oct 07 17:07:25 crc kubenswrapper[4681]: I1007 17:07:25.368540 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jvq9k" Oct 07 17:07:25 crc kubenswrapper[4681]: I1007 17:07:25.889461 
4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jvq9k" Oct 07 17:07:26 crc kubenswrapper[4681]: I1007 17:07:26.783224 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ntpwx" Oct 07 17:07:26 crc kubenswrapper[4681]: I1007 17:07:26.783277 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ntpwx" Oct 07 17:07:26 crc kubenswrapper[4681]: I1007 17:07:26.830102 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ntpwx" Oct 07 17:07:26 crc kubenswrapper[4681]: I1007 17:07:26.890588 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ntpwx" Oct 07 17:07:27 crc kubenswrapper[4681]: I1007 17:07:27.723812 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c95wr" Oct 07 17:07:27 crc kubenswrapper[4681]: I1007 17:07:27.723866 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c95wr" Oct 07 17:07:27 crc kubenswrapper[4681]: I1007 17:07:27.766131 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c95wr" Oct 07 17:07:27 crc kubenswrapper[4681]: I1007 17:07:27.903225 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c95wr" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.092463 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" podUID="c2c64f34-b460-412c-b82e-2dbc6c93444e" containerName="oauth-openshift" containerID="cri-o://05cc4b99840a9e8dfd1f03e3ae935138c490472a827117dcfa16ccd394711d04" gracePeriod=15 Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.476213 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.512572 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv"] Oct 07 17:07:34 crc kubenswrapper[4681]: E1007 17:07:34.512764 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2c64f34-b460-412c-b82e-2dbc6c93444e" containerName="oauth-openshift" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.512774 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2c64f34-b460-412c-b82e-2dbc6c93444e" containerName="oauth-openshift" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.512873 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2c64f34-b460-412c-b82e-2dbc6c93444e" containerName="oauth-openshift" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.513354 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.522696 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv"] Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.546694 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c2c64f34-b460-412c-b82e-2dbc6c93444e-audit-policies\") pod \"c2c64f34-b460-412c-b82e-2dbc6c93444e\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.546748 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-user-template-login\") pod \"c2c64f34-b460-412c-b82e-2dbc6c93444e\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.546787 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-user-idp-0-file-data\") pod \"c2c64f34-b460-412c-b82e-2dbc6c93444e\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.546812 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-router-certs\") pod \"c2c64f34-b460-412c-b82e-2dbc6c93444e\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.546840 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-serving-cert\") pod \"c2c64f34-b460-412c-b82e-2dbc6c93444e\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.546907 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-service-ca\") pod \"c2c64f34-b460-412c-b82e-2dbc6c93444e\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.546947 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-user-template-provider-selection\") pod \"c2c64f34-b460-412c-b82e-2dbc6c93444e\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.546980 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-session\") pod \"c2c64f34-b460-412c-b82e-2dbc6c93444e\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.547003 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-ocp-branding-template\") pod \"c2c64f34-b460-412c-b82e-2dbc6c93444e\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.547030 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c2c64f34-b460-412c-b82e-2dbc6c93444e-audit-dir\") pod \"c2c64f34-b460-412c-b82e-2dbc6c93444e\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.547050 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-trusted-ca-bundle\") pod \"c2c64f34-b460-412c-b82e-2dbc6c93444e\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.547081 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxdjv\" (UniqueName: \"kubernetes.io/projected/c2c64f34-b460-412c-b82e-2dbc6c93444e-kube-api-access-mxdjv\") pod \"c2c64f34-b460-412c-b82e-2dbc6c93444e\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.547133 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-user-template-error\") pod \"c2c64f34-b460-412c-b82e-2dbc6c93444e\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.547159 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-cliconfig\") pod \"c2c64f34-b460-412c-b82e-2dbc6c93444e\" (UID: \"c2c64f34-b460-412c-b82e-2dbc6c93444e\") " Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.548268 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2c64f34-b460-412c-b82e-2dbc6c93444e-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "c2c64f34-b460-412c-b82e-2dbc6c93444e" (UID: "c2c64f34-b460-412c-b82e-2dbc6c93444e"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.548309 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "c2c64f34-b460-412c-b82e-2dbc6c93444e" (UID: "c2c64f34-b460-412c-b82e-2dbc6c93444e"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.549795 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "c2c64f34-b460-412c-b82e-2dbc6c93444e" (UID: "c2c64f34-b460-412c-b82e-2dbc6c93444e"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.549846 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2c64f34-b460-412c-b82e-2dbc6c93444e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "c2c64f34-b460-412c-b82e-2dbc6c93444e" (UID: "c2c64f34-b460-412c-b82e-2dbc6c93444e"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.550037 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "c2c64f34-b460-412c-b82e-2dbc6c93444e" (UID: "c2c64f34-b460-412c-b82e-2dbc6c93444e"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.562685 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "c2c64f34-b460-412c-b82e-2dbc6c93444e" (UID: "c2c64f34-b460-412c-b82e-2dbc6c93444e"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.562706 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2c64f34-b460-412c-b82e-2dbc6c93444e-kube-api-access-mxdjv" (OuterVolumeSpecName: "kube-api-access-mxdjv") pod "c2c64f34-b460-412c-b82e-2dbc6c93444e" (UID: "c2c64f34-b460-412c-b82e-2dbc6c93444e"). InnerVolumeSpecName "kube-api-access-mxdjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.563220 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "c2c64f34-b460-412c-b82e-2dbc6c93444e" (UID: "c2c64f34-b460-412c-b82e-2dbc6c93444e"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.563310 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "c2c64f34-b460-412c-b82e-2dbc6c93444e" (UID: "c2c64f34-b460-412c-b82e-2dbc6c93444e"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.567053 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "c2c64f34-b460-412c-b82e-2dbc6c93444e" (UID: "c2c64f34-b460-412c-b82e-2dbc6c93444e"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.568926 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "c2c64f34-b460-412c-b82e-2dbc6c93444e" (UID: "c2c64f34-b460-412c-b82e-2dbc6c93444e"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.569178 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "c2c64f34-b460-412c-b82e-2dbc6c93444e" (UID: "c2c64f34-b460-412c-b82e-2dbc6c93444e"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.569256 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "c2c64f34-b460-412c-b82e-2dbc6c93444e" (UID: "c2c64f34-b460-412c-b82e-2dbc6c93444e"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.569678 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "c2c64f34-b460-412c-b82e-2dbc6c93444e" (UID: "c2c64f34-b460-412c-b82e-2dbc6c93444e"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648413 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648473 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648496 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648516 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-user-template-error\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648534 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648553 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/754ab85a-53bc-4d6a-b638-28f61b23f76f-audit-policies\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648569 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/754ab85a-53bc-4d6a-b638-28f61b23f76f-audit-dir\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648584 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-user-template-login\") 
pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648601 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648647 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-system-session\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648673 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648696 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-system-service-ca\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648712 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks687\" (UniqueName: \"kubernetes.io/projected/754ab85a-53bc-4d6a-b638-28f61b23f76f-kube-api-access-ks687\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648742 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-system-router-certs\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648777 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648788 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:34 crc 
kubenswrapper[4681]: I1007 17:07:34.648798 4681 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c2c64f34-b460-412c-b82e-2dbc6c93444e-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648808 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648819 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648828 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648838 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648847 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648857 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648866 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648892 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648902 4681 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c2c64f34-b460-412c-b82e-2dbc6c93444e-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648910 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2c64f34-b460-412c-b82e-2dbc6c93444e-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.648920 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxdjv\" (UniqueName: \"kubernetes.io/projected/c2c64f34-b460-412c-b82e-2dbc6c93444e-kube-api-access-mxdjv\") on node \"crc\" DevicePath \"\"" Oct 07 17:07:34 crc 
kubenswrapper[4681]: I1007 17:07:34.750060 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-system-session\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.750131 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.750159 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-system-service-ca\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.750183 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks687\" (UniqueName: \"kubernetes.io/projected/754ab85a-53bc-4d6a-b638-28f61b23f76f-kube-api-access-ks687\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.750223 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-system-router-certs\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.750252 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.750312 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.750339 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc 
kubenswrapper[4681]: I1007 17:07:34.750361 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-user-template-error\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.750382 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.750406 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/754ab85a-53bc-4d6a-b638-28f61b23f76f-audit-policies\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.750428 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-user-template-login\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.750448 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/754ab85a-53bc-4d6a-b638-28f61b23f76f-audit-dir\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.750468 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.751194 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-system-service-ca\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.751175 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.751393 4681 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.751726 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/754ab85a-53bc-4d6a-b638-28f61b23f76f-audit-policies\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.753718 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.753785 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/754ab85a-53bc-4d6a-b638-28f61b23f76f-audit-dir\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.757527 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-user-template-login\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.759027 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-user-template-error\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.759564 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-system-session\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.760288 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-system-router-certs\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.765236 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.766413 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.767847 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/754ab85a-53bc-4d6a-b638-28f61b23f76f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.771414 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks687\" (UniqueName: \"kubernetes.io/projected/754ab85a-53bc-4d6a-b638-28f61b23f76f-kube-api-access-ks687\") pod \"oauth-openshift-6775b6d8cc-nd6tv\" (UID: \"754ab85a-53bc-4d6a-b638-28f61b23f76f\") " pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.845936 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.891291 4681 generic.go:334] "Generic (PLEG): container finished" podID="c2c64f34-b460-412c-b82e-2dbc6c93444e" containerID="05cc4b99840a9e8dfd1f03e3ae935138c490472a827117dcfa16ccd394711d04" exitCode=0 Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.891336 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" event={"ID":"c2c64f34-b460-412c-b82e-2dbc6c93444e","Type":"ContainerDied","Data":"05cc4b99840a9e8dfd1f03e3ae935138c490472a827117dcfa16ccd394711d04"} Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.891364 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" event={"ID":"c2c64f34-b460-412c-b82e-2dbc6c93444e","Type":"ContainerDied","Data":"92e0324ea206585f6cd3112c710ea9922079e3c3b633afd3dcd230d8db41721e"} Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.891382 4681 scope.go:117] "RemoveContainer" containerID="05cc4b99840a9e8dfd1f03e3ae935138c490472a827117dcfa16ccd394711d04" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.891491 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tw9ww" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.940944 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tw9ww"] Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.942469 4681 scope.go:117] "RemoveContainer" containerID="05cc4b99840a9e8dfd1f03e3ae935138c490472a827117dcfa16ccd394711d04" Oct 07 17:07:34 crc kubenswrapper[4681]: E1007 17:07:34.944409 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05cc4b99840a9e8dfd1f03e3ae935138c490472a827117dcfa16ccd394711d04\": container with ID starting with 05cc4b99840a9e8dfd1f03e3ae935138c490472a827117dcfa16ccd394711d04 not found: ID does not exist" containerID="05cc4b99840a9e8dfd1f03e3ae935138c490472a827117dcfa16ccd394711d04" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.944539 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05cc4b99840a9e8dfd1f03e3ae935138c490472a827117dcfa16ccd394711d04"} err="failed to get container status \"05cc4b99840a9e8dfd1f03e3ae935138c490472a827117dcfa16ccd394711d04\": rpc error: code = NotFound desc = could not find container \"05cc4b99840a9e8dfd1f03e3ae935138c490472a827117dcfa16ccd394711d04\": container with ID starting with 05cc4b99840a9e8dfd1f03e3ae935138c490472a827117dcfa16ccd394711d04 not found: ID does not exist" Oct 07 17:07:34 crc kubenswrapper[4681]: I1007 17:07:34.949588 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tw9ww"] Oct 07 17:07:35 crc kubenswrapper[4681]: I1007 17:07:35.036103 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2c64f34-b460-412c-b82e-2dbc6c93444e" path="/var/lib/kubelet/pods/c2c64f34-b460-412c-b82e-2dbc6c93444e/volumes" Oct 07 17:07:35 crc kubenswrapper[4681]: I1007 17:07:35.266090 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv"] Oct 07 17:07:35 crc kubenswrapper[4681]: W1007 17:07:35.274707 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod754ab85a_53bc_4d6a_b638_28f61b23f76f.slice/crio-104ba8be0d45cc31df654495878b042eb6ebaaf0776c8306711adc863839104d WatchSource:0}: Error finding container 104ba8be0d45cc31df654495878b042eb6ebaaf0776c8306711adc863839104d: Status 404 returned error can't find the container with id 104ba8be0d45cc31df654495878b042eb6ebaaf0776c8306711adc863839104d Oct 07 17:07:35 crc kubenswrapper[4681]: I1007 17:07:35.901278 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" event={"ID":"754ab85a-53bc-4d6a-b638-28f61b23f76f","Type":"ContainerStarted","Data":"f21e73e70b188ce1ebf89a3beed6ff64a010f0463a7a1376f6dc7e2264c738f1"} Oct 07 17:07:35 crc kubenswrapper[4681]: I1007 17:07:35.901664 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" event={"ID":"754ab85a-53bc-4d6a-b638-28f61b23f76f","Type":"ContainerStarted","Data":"104ba8be0d45cc31df654495878b042eb6ebaaf0776c8306711adc863839104d"} Oct 07 17:07:35 crc kubenswrapper[4681]: I1007 17:07:35.903036 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" Oct 07 17:07:35 crc 
kubenswrapper[4681]: I1007 17:07:35.925757 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv" podStartSLOduration=26.92573049 podStartE2EDuration="26.92573049s" podCreationTimestamp="2025-10-07 17:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:07:35.922211558 +0000 UTC m=+259.569623143" watchObservedRunningTime="2025-10-07 17:07:35.92573049 +0000 UTC m=+259.573142075"
Oct 07 17:07:36 crc kubenswrapper[4681]: I1007 17:07:36.177017 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6775b6d8cc-nd6tv"
Oct 07 17:07:49 crc kubenswrapper[4681]: I1007 17:07:49.878672 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" podUID="66e9eba2-1514-42a7-b14b-802c380cc3b3" containerName="registry" containerID="cri-o://f5625d0927459b5d1ad431f5237259c9f986172ac2163bd135bb35baeafacf69" gracePeriod=30
Oct 07 17:07:50 crc kubenswrapper[4681]: I1007 17:07:50.277379 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:07:50 crc kubenswrapper[4681]: I1007 17:07:50.361754 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/66e9eba2-1514-42a7-b14b-802c380cc3b3-registry-certificates\") pod \"66e9eba2-1514-42a7-b14b-802c380cc3b3\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") "
Oct 07 17:07:50 crc kubenswrapper[4681]: I1007 17:07:50.361802 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnmtf\" (UniqueName: \"kubernetes.io/projected/66e9eba2-1514-42a7-b14b-802c380cc3b3-kube-api-access-qnmtf\") pod \"66e9eba2-1514-42a7-b14b-802c380cc3b3\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") "
Oct 07 17:07:50 crc kubenswrapper[4681]: I1007 17:07:50.361832 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66e9eba2-1514-42a7-b14b-802c380cc3b3-trusted-ca\") pod \"66e9eba2-1514-42a7-b14b-802c380cc3b3\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") "
Oct 07 17:07:50 crc kubenswrapper[4681]: I1007 17:07:50.361857 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/66e9eba2-1514-42a7-b14b-802c380cc3b3-installation-pull-secrets\") pod \"66e9eba2-1514-42a7-b14b-802c380cc3b3\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") "
Oct 07 17:07:50 crc kubenswrapper[4681]: I1007 17:07:50.361933 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/66e9eba2-1514-42a7-b14b-802c380cc3b3-bound-sa-token\") pod \"66e9eba2-1514-42a7-b14b-802c380cc3b3\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") "
Oct 07 17:07:50 crc kubenswrapper[4681]: I1007 17:07:50.362739 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66e9eba2-1514-42a7-b14b-802c380cc3b3-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "66e9eba2-1514-42a7-b14b-802c380cc3b3" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 17:07:50 crc kubenswrapper[4681]: I1007 17:07:50.362806 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66e9eba2-1514-42a7-b14b-802c380cc3b3-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "66e9eba2-1514-42a7-b14b-802c380cc3b3" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 17:07:50 crc kubenswrapper[4681]: I1007 17:07:50.362859 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/66e9eba2-1514-42a7-b14b-802c380cc3b3-registry-tls\") pod \"66e9eba2-1514-42a7-b14b-802c380cc3b3\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") "
Oct 07 17:07:50 crc kubenswrapper[4681]: I1007 17:07:50.363035 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"66e9eba2-1514-42a7-b14b-802c380cc3b3\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") "
Oct 07 17:07:50 crc kubenswrapper[4681]: I1007 17:07:50.363116 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/66e9eba2-1514-42a7-b14b-802c380cc3b3-ca-trust-extracted\") pod \"66e9eba2-1514-42a7-b14b-802c380cc3b3\" (UID: \"66e9eba2-1514-42a7-b14b-802c380cc3b3\") "
Oct 07 17:07:50 crc kubenswrapper[4681]: I1007 17:07:50.363390 4681 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66e9eba2-1514-42a7-b14b-802c380cc3b3-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 07 17:07:50 crc kubenswrapper[4681]: I1007 17:07:50.363418 4681 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/66e9eba2-1514-42a7-b14b-802c380cc3b3-registry-certificates\") on node \"crc\" DevicePath \"\""
Oct 07 17:07:50 crc kubenswrapper[4681]: I1007 17:07:50.370558 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66e9eba2-1514-42a7-b14b-802c380cc3b3-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "66e9eba2-1514-42a7-b14b-802c380cc3b3" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 17:07:50 crc kubenswrapper[4681]: I1007 17:07:50.371094 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66e9eba2-1514-42a7-b14b-802c380cc3b3-kube-api-access-qnmtf" (OuterVolumeSpecName: "kube-api-access-qnmtf") pod "66e9eba2-1514-42a7-b14b-802c380cc3b3" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3"). InnerVolumeSpecName "kube-api-access-qnmtf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 17:07:50 crc kubenswrapper[4681]: I1007 17:07:50.371293 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66e9eba2-1514-42a7-b14b-802c380cc3b3-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "66e9eba2-1514-42a7-b14b-802c380cc3b3" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 17:07:50 crc kubenswrapper[4681]: I1007 17:07:50.371648 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66e9eba2-1514-42a7-b14b-802c380cc3b3-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "66e9eba2-1514-42a7-b14b-802c380cc3b3" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 17:07:50 crc kubenswrapper[4681]: I1007 17:07:50.375225 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "66e9eba2-1514-42a7-b14b-802c380cc3b3" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Oct 07 17:07:50 crc kubenswrapper[4681]: I1007 17:07:50.382369 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66e9eba2-1514-42a7-b14b-802c380cc3b3-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "66e9eba2-1514-42a7-b14b-802c380cc3b3" (UID: "66e9eba2-1514-42a7-b14b-802c380cc3b3"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 17:07:50 crc kubenswrapper[4681]: I1007 17:07:50.464591 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnmtf\" (UniqueName: \"kubernetes.io/projected/66e9eba2-1514-42a7-b14b-802c380cc3b3-kube-api-access-qnmtf\") on node \"crc\" DevicePath \"\""
Oct 07 17:07:50 crc kubenswrapper[4681]: I1007 17:07:50.464634 4681 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/66e9eba2-1514-42a7-b14b-802c380cc3b3-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Oct 07 17:07:50 crc kubenswrapper[4681]: I1007 17:07:50.464644 4681 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/66e9eba2-1514-42a7-b14b-802c380cc3b3-bound-sa-token\") on node \"crc\" DevicePath \"\""
Oct 07 17:07:50 crc kubenswrapper[4681]: I1007 17:07:50.464655 4681 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/66e9eba2-1514-42a7-b14b-802c380cc3b3-registry-tls\") on node \"crc\" DevicePath \"\""
Oct 07 17:07:50 crc kubenswrapper[4681]: I1007 17:07:50.464667 4681 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/66e9eba2-1514-42a7-b14b-802c380cc3b3-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Oct 07 17:07:51 crc kubenswrapper[4681]: I1007 17:07:51.002487 4681 generic.go:334] "Generic (PLEG): container finished" podID="66e9eba2-1514-42a7-b14b-802c380cc3b3" containerID="f5625d0927459b5d1ad431f5237259c9f986172ac2163bd135bb35baeafacf69" exitCode=0
Oct 07 17:07:51 crc kubenswrapper[4681]: I1007 17:07:51.002532 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" event={"ID":"66e9eba2-1514-42a7-b14b-802c380cc3b3","Type":"ContainerDied","Data":"f5625d0927459b5d1ad431f5237259c9f986172ac2163bd135bb35baeafacf69"}
Oct 07 17:07:51 crc kubenswrapper[4681]: I1007 17:07:51.002553 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp"
Oct 07 17:07:51 crc kubenswrapper[4681]: I1007 17:07:51.002568 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vr5kp" event={"ID":"66e9eba2-1514-42a7-b14b-802c380cc3b3","Type":"ContainerDied","Data":"09309c0e5ea1d03a85c2e2ad6a645e3c4c0523c02dc0e1fdd31ea56f70c8eaeb"}
Oct 07 17:07:51 crc kubenswrapper[4681]: I1007 17:07:51.002586 4681 scope.go:117] "RemoveContainer" containerID="f5625d0927459b5d1ad431f5237259c9f986172ac2163bd135bb35baeafacf69"
Oct 07 17:07:51 crc kubenswrapper[4681]: I1007 17:07:51.020183 4681 scope.go:117] "RemoveContainer" containerID="f5625d0927459b5d1ad431f5237259c9f986172ac2163bd135bb35baeafacf69"
Oct 07 17:07:51 crc kubenswrapper[4681]: E1007 17:07:51.021008 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5625d0927459b5d1ad431f5237259c9f986172ac2163bd135bb35baeafacf69\": container with ID starting with f5625d0927459b5d1ad431f5237259c9f986172ac2163bd135bb35baeafacf69 not found: ID does not exist" containerID="f5625d0927459b5d1ad431f5237259c9f986172ac2163bd135bb35baeafacf69"
Oct 07 17:07:51 crc kubenswrapper[4681]: I1007 17:07:51.021049 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5625d0927459b5d1ad431f5237259c9f986172ac2163bd135bb35baeafacf69"} err="failed to get container status \"f5625d0927459b5d1ad431f5237259c9f986172ac2163bd135bb35baeafacf69\": rpc error: code = NotFound desc = could not find container \"f5625d0927459b5d1ad431f5237259c9f986172ac2163bd135bb35baeafacf69\": container with ID starting with f5625d0927459b5d1ad431f5237259c9f986172ac2163bd135bb35baeafacf69 not found: ID does not exist"
Oct 07 17:07:51 crc kubenswrapper[4681]: I1007 17:07:51.037608 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vr5kp"]
Oct 07 17:07:51 crc kubenswrapper[4681]: I1007 17:07:51.037645 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vr5kp"]
Oct 07 17:07:53 crc kubenswrapper[4681]: I1007 17:07:53.042774 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66e9eba2-1514-42a7-b14b-802c380cc3b3" path="/var/lib/kubelet/pods/66e9eba2-1514-42a7-b14b-802c380cc3b3/volumes"
Oct 07 17:08:42 crc kubenswrapper[4681]: I1007 17:08:42.195303 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 17:08:42 crc kubenswrapper[4681]: I1007 17:08:42.195768 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 17:09:12 crc kubenswrapper[4681]: I1007 17:09:12.195210 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 17:09:12 crc kubenswrapper[4681]: I1007 17:09:12.196004 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 17:09:42 crc kubenswrapper[4681]: I1007 17:09:42.194979 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 17:09:42 crc kubenswrapper[4681]: I1007 17:09:42.195492 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 17:09:42 crc kubenswrapper[4681]: I1007 17:09:42.195537 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6"
Oct 07 17:09:42 crc kubenswrapper[4681]: I1007 17:09:42.196080 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"39663f4fcfd152dc8dc829b17c20ffbb5fc718910f72b7fe94058ef2d7e4c422"} pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 07 17:09:42 crc kubenswrapper[4681]: I1007 17:09:42.196131 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" containerID="cri-o://39663f4fcfd152dc8dc829b17c20ffbb5fc718910f72b7fe94058ef2d7e4c422" gracePeriod=600
Oct 07 17:09:42 crc kubenswrapper[4681]: I1007 17:09:42.607525 4681 generic.go:334] "Generic (PLEG): container finished" podID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerID="39663f4fcfd152dc8dc829b17c20ffbb5fc718910f72b7fe94058ef2d7e4c422" exitCode=0
Oct 07 17:09:42 crc kubenswrapper[4681]: I1007 17:09:42.607630 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerDied","Data":"39663f4fcfd152dc8dc829b17c20ffbb5fc718910f72b7fe94058ef2d7e4c422"}
Oct 07 17:09:42 crc kubenswrapper[4681]: I1007 17:09:42.607905 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerStarted","Data":"caa518685bdba2e7b2d342f42ad137f5c20588f53105f29b5bb3989eba11aede"}
Oct 07 17:09:42 crc kubenswrapper[4681]: I1007 17:09:42.607936 4681 scope.go:117] "RemoveContainer" containerID="239495e11d5c0e245c57ecd965d11ee0e31b8e530a75b51e8bcbb2b678b5987c"
Oct 07 17:11:42 crc kubenswrapper[4681]: I1007 17:11:42.195982 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 17:11:42 crc kubenswrapper[4681]: I1007 17:11:42.198595 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 17:12:12 crc kubenswrapper[4681]: I1007 17:12:12.195392 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 17:12:12 crc kubenswrapper[4681]: I1007 17:12:12.195865 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 17:12:42 crc kubenswrapper[4681]: I1007 17:12:42.195206 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 17:12:42 crc kubenswrapper[4681]: I1007 17:12:42.195827 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 17:12:42 crc kubenswrapper[4681]: I1007 17:12:42.195929 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6"
Oct 07 17:12:42 crc kubenswrapper[4681]: I1007 17:12:42.196561 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"caa518685bdba2e7b2d342f42ad137f5c20588f53105f29b5bb3989eba11aede"} pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 07 17:12:42 crc kubenswrapper[4681]: I1007 17:12:42.196618 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" containerID="cri-o://caa518685bdba2e7b2d342f42ad137f5c20588f53105f29b5bb3989eba11aede" gracePeriod=600
Oct 07 17:12:42 crc kubenswrapper[4681]: I1007 17:12:42.582527 4681 generic.go:334] "Generic (PLEG): container finished" podID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerID="caa518685bdba2e7b2d342f42ad137f5c20588f53105f29b5bb3989eba11aede" exitCode=0
Oct 07 17:12:42 crc kubenswrapper[4681]: I1007 17:12:42.582576 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerDied","Data":"caa518685bdba2e7b2d342f42ad137f5c20588f53105f29b5bb3989eba11aede"}
Oct 07 17:12:42 crc kubenswrapper[4681]: I1007 17:12:42.582991 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerStarted","Data":"c9e051db851240f0bede3ae5fc25fdd2610a6d8f3198a352363e8b66b292625b"}
Oct 07 17:12:42 crc kubenswrapper[4681]: I1007 17:12:42.583017 4681 scope.go:117] "RemoveContainer" containerID="39663f4fcfd152dc8dc829b17c20ffbb5fc718910f72b7fe94058ef2d7e4c422"
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.336359 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-wvq66"]
Oct 07 17:14:22 crc kubenswrapper[4681]: E1007 17:14:22.337207 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66e9eba2-1514-42a7-b14b-802c380cc3b3" containerName="registry"
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.337225 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="66e9eba2-1514-42a7-b14b-802c380cc3b3" containerName="registry"
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.337342 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="66e9eba2-1514-42a7-b14b-802c380cc3b3" containerName="registry"
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.337812 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-wvq66"
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.341302 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.341541 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.341986 4681 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-mfpd7"
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.349335 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-5qxkp"]
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.350178 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-5qxkp"
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.358396 4681 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-9jxfr"
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.358721 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-wvq66"]
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.366081 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-5qxkp"]
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.389900 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-djw5h"]
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.390655 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-djw5h"
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.394435 4681 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-f8jnk"
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.413178 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldlkk\" (UniqueName: \"kubernetes.io/projected/b603cb8d-41a5-4537-95da-d2e4fa39ce75-kube-api-access-ldlkk\") pod \"cert-manager-cainjector-7f985d654d-wvq66\" (UID: \"b603cb8d-41a5-4537-95da-d2e4fa39ce75\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-wvq66"
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.447722 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-djw5h"]
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.514278 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldlkk\" (UniqueName: \"kubernetes.io/projected/b603cb8d-41a5-4537-95da-d2e4fa39ce75-kube-api-access-ldlkk\") pod \"cert-manager-cainjector-7f985d654d-wvq66\" (UID: \"b603cb8d-41a5-4537-95da-d2e4fa39ce75\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-wvq66"
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.514358 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw6ff\" (UniqueName: \"kubernetes.io/projected/a8a15de5-2d99-41a6-b4c9-7d31c28413b2-kube-api-access-xw6ff\") pod \"cert-manager-webhook-5655c58dd6-djw5h\" (UID: \"a8a15de5-2d99-41a6-b4c9-7d31c28413b2\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-djw5h"
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.514390 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k55hh\" (UniqueName: \"kubernetes.io/projected/e132d85f-c498-41eb-a780-be92455331bb-kube-api-access-k55hh\") pod \"cert-manager-5b446d88c5-5qxkp\" (UID: \"e132d85f-c498-41eb-a780-be92455331bb\") " pod="cert-manager/cert-manager-5b446d88c5-5qxkp"
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.550596 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldlkk\" (UniqueName: \"kubernetes.io/projected/b603cb8d-41a5-4537-95da-d2e4fa39ce75-kube-api-access-ldlkk\") pod \"cert-manager-cainjector-7f985d654d-wvq66\" (UID: \"b603cb8d-41a5-4537-95da-d2e4fa39ce75\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-wvq66"
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.615147 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw6ff\" (UniqueName: \"kubernetes.io/projected/a8a15de5-2d99-41a6-b4c9-7d31c28413b2-kube-api-access-xw6ff\") pod \"cert-manager-webhook-5655c58dd6-djw5h\" (UID: \"a8a15de5-2d99-41a6-b4c9-7d31c28413b2\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-djw5h"
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.615207 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k55hh\" (UniqueName: \"kubernetes.io/projected/e132d85f-c498-41eb-a780-be92455331bb-kube-api-access-k55hh\") pod \"cert-manager-5b446d88c5-5qxkp\" (UID: \"e132d85f-c498-41eb-a780-be92455331bb\") " pod="cert-manager/cert-manager-5b446d88c5-5qxkp"
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.633577 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k55hh\" (UniqueName: \"kubernetes.io/projected/e132d85f-c498-41eb-a780-be92455331bb-kube-api-access-k55hh\") pod \"cert-manager-5b446d88c5-5qxkp\" (UID: \"e132d85f-c498-41eb-a780-be92455331bb\") " pod="cert-manager/cert-manager-5b446d88c5-5qxkp"
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.637034 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw6ff\" (UniqueName: \"kubernetes.io/projected/a8a15de5-2d99-41a6-b4c9-7d31c28413b2-kube-api-access-xw6ff\") pod \"cert-manager-webhook-5655c58dd6-djw5h\" (UID: \"a8a15de5-2d99-41a6-b4c9-7d31c28413b2\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-djw5h"
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.661756 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-wvq66"
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.673431 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-5qxkp"
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.706474 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-djw5h"
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.950996 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-wvq66"]
Oct 07 17:14:22 crc kubenswrapper[4681]: W1007 17:14:22.956251 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb603cb8d_41a5_4537_95da_d2e4fa39ce75.slice/crio-18ce100bb930791d79952b37c1a1a0f0f93b512c6d47433ccbda874b75842d05 WatchSource:0}: Error finding container 18ce100bb930791d79952b37c1a1a0f0f93b512c6d47433ccbda874b75842d05: Status 404 returned error can't find the container with id 18ce100bb930791d79952b37c1a1a0f0f93b512c6d47433ccbda874b75842d05
Oct 07 17:14:22 crc kubenswrapper[4681]: I1007 17:14:22.959471 4681 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 07 17:14:23 crc kubenswrapper[4681]: I1007 17:14:23.002433 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-djw5h"]
Oct 07 17:14:23 crc kubenswrapper[4681]: I1007 17:14:23.081426 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-djw5h" event={"ID":"a8a15de5-2d99-41a6-b4c9-7d31c28413b2","Type":"ContainerStarted","Data":"79e1c875e48fb2d618d2504bd405707f16b3c1d31068b791f4dd5e4a65999e06"}
Oct 07 17:14:23 crc kubenswrapper[4681]: I1007 17:14:23.082062 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-wvq66" event={"ID":"b603cb8d-41a5-4537-95da-d2e4fa39ce75","Type":"ContainerStarted","Data":"18ce100bb930791d79952b37c1a1a0f0f93b512c6d47433ccbda874b75842d05"}
Oct 07 17:14:23 crc kubenswrapper[4681]: I1007 17:14:23.230840 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-5qxkp"]
Oct 07 17:14:23 crc kubenswrapper[4681]: W1007 17:14:23.236355 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode132d85f_c498_41eb_a780_be92455331bb.slice/crio-a8a80667b501755400f89c0d84c94f6890f2ba35cecef05200ccdd60f5b03ad3 WatchSource:0}: Error finding container a8a80667b501755400f89c0d84c94f6890f2ba35cecef05200ccdd60f5b03ad3: Status 404 returned error can't find the container with id a8a80667b501755400f89c0d84c94f6890f2ba35cecef05200ccdd60f5b03ad3
Oct 07 17:14:24 crc kubenswrapper[4681]: I1007 17:14:24.088613 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-5qxkp" event={"ID":"e132d85f-c498-41eb-a780-be92455331bb","Type":"ContainerStarted","Data":"a8a80667b501755400f89c0d84c94f6890f2ba35cecef05200ccdd60f5b03ad3"}
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.065357 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-d6lkl"]
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.070702 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="ovn-controller" containerID="cri-o://42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139" gracePeriod=30
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.071106 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="sbdb" containerID="cri-o://8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353" gracePeriod=30
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.071147 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="nbdb" containerID="cri-o://ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b" gracePeriod=30
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.071182 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="northd" containerID="cri-o://fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0" gracePeriod=30
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.071212 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0" gracePeriod=30
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.071241 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="kube-rbac-proxy-node" containerID="cri-o://1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa" gracePeriod=30
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.071274 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="ovn-acl-logging" containerID="cri-o://7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a" gracePeriod=30
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.124052 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="ovnkube-controller" containerID="cri-o://3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a" gracePeriod=30
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.773009 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6lkl_615b8d72-0ec5-42d0-966e-db1c2b787962/ovnkube-controller/3.log"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.776321 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6lkl_615b8d72-0ec5-42d0-966e-db1c2b787962/ovn-acl-logging/0.log"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.777657 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6lkl_615b8d72-0ec5-42d0-966e-db1c2b787962/ovn-controller/0.log"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.778167 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.833727 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lrg2s"]
Oct 07 17:14:38 crc kubenswrapper[4681]: E1007 17:14:38.834202 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="ovn-controller"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.834227 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="ovn-controller"
Oct 07 17:14:38 crc kubenswrapper[4681]: E1007 17:14:38.834241 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="nbdb"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.834252 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="nbdb"
Oct 07 17:14:38 crc kubenswrapper[4681]: E1007 17:14:38.834298 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="kube-rbac-proxy-ovn-metrics"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.834310 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="kube-rbac-proxy-ovn-metrics"
Oct 07 17:14:38 crc kubenswrapper[4681]: E1007 17:14:38.834321 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="ovnkube-controller"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.834330 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="ovnkube-controller"
Oct 07 17:14:38 crc kubenswrapper[4681]: E1007 17:14:38.834368 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="kubecfg-setup"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.834380 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="kubecfg-setup"
Oct 07 17:14:38 crc kubenswrapper[4681]: E1007 17:14:38.834393 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="ovnkube-controller"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.834401 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="ovnkube-controller"
Oct 07 17:14:38 crc kubenswrapper[4681]: E1007 17:14:38.834411 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="ovnkube-controller"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.834419 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="ovnkube-controller"
Oct 07 17:14:38 crc kubenswrapper[4681]: E1007 17:14:38.834461 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="ovn-acl-logging"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.834469 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="ovn-acl-logging"
Oct 07 17:14:38 crc kubenswrapper[4681]: E1007 17:14:38.834481 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="ovnkube-controller"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.834490 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="ovnkube-controller"
Oct 07 17:14:38 crc kubenswrapper[4681]: E1007 17:14:38.834498 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="sbdb"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.834530 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="sbdb"
Oct 07 17:14:38 crc kubenswrapper[4681]: E1007 17:14:38.834546 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="kube-rbac-proxy-node"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.834558 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="kube-rbac-proxy-node"
Oct 07 17:14:38 crc kubenswrapper[4681]: E1007 17:14:38.834569 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="northd"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.834577 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="northd"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.834811 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="ovn-controller"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.834829 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="ovnkube-controller"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.834860 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="nbdb"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.834906 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="ovn-acl-logging"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.834916 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="ovnkube-controller"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.834925 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="ovnkube-controller"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.834932 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="ovnkube-controller"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.834944 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="kube-rbac-proxy-node"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.834956 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="sbdb"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.834989 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="northd"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.835000 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="kube-rbac-proxy-ovn-metrics"
Oct 07 17:14:38 crc kubenswrapper[4681]: E1007 17:14:38.836071 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="ovnkube-controller"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.836125 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="ovnkube-controller"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.836251 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerName="ovnkube-controller"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.839770 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936411 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-run-netns\") pod \"615b8d72-0ec5-42d0-966e-db1c2b787962\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") "
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936473 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-var-lib-openvswitch\") pod \"615b8d72-0ec5-42d0-966e-db1c2b787962\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") "
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936509 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-run-ovn\") pod \"615b8d72-0ec5-42d0-966e-db1c2b787962\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") "
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936544 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/615b8d72-0ec5-42d0-966e-db1c2b787962-env-overrides\") pod \"615b8d72-0ec5-42d0-966e-db1c2b787962\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") "
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936536 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "615b8d72-0ec5-42d0-966e-db1c2b787962" (UID: "615b8d72-0ec5-42d0-966e-db1c2b787962"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936575 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-var-lib-cni-networks-ovn-kubernetes\") pod \"615b8d72-0ec5-42d0-966e-db1c2b787962\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") "
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936592 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "615b8d72-0ec5-42d0-966e-db1c2b787962" (UID: "615b8d72-0ec5-42d0-966e-db1c2b787962"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936601 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-slash\") pod \"615b8d72-0ec5-42d0-966e-db1c2b787962\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") "
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936606 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "615b8d72-0ec5-42d0-966e-db1c2b787962" (UID: "615b8d72-0ec5-42d0-966e-db1c2b787962"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936629 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/615b8d72-0ec5-42d0-966e-db1c2b787962-ovnkube-script-lib\") pod \"615b8d72-0ec5-42d0-966e-db1c2b787962\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") "
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936633 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "615b8d72-0ec5-42d0-966e-db1c2b787962" (UID: "615b8d72-0ec5-42d0-966e-db1c2b787962"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936656 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-cni-bin\") pod \"615b8d72-0ec5-42d0-966e-db1c2b787962\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") "
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936678 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-run-openvswitch\") pod \"615b8d72-0ec5-42d0-966e-db1c2b787962\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") "
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936706 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-cni-netd\") pod \"615b8d72-0ec5-42d0-966e-db1c2b787962\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") "
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936725 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-systemd-units\") pod \"615b8d72-0ec5-42d0-966e-db1c2b787962\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") "
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936755 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/615b8d72-0ec5-42d0-966e-db1c2b787962-ovnkube-config\") pod \"615b8d72-0ec5-42d0-966e-db1c2b787962\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") "
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936776 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-etc-openvswitch\") pod \"615b8d72-0ec5-42d0-966e-db1c2b787962\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") "
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936791 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-node-log\") pod \"615b8d72-0ec5-42d0-966e-db1c2b787962\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") "
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936656 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-slash" (OuterVolumeSpecName: "host-slash") pod "615b8d72-0ec5-42d0-966e-db1c2b787962" (UID: "615b8d72-0ec5-42d0-966e-db1c2b787962"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936674 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "615b8d72-0ec5-42d0-966e-db1c2b787962" (UID: "615b8d72-0ec5-42d0-966e-db1c2b787962"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936700 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "615b8d72-0ec5-42d0-966e-db1c2b787962" (UID: "615b8d72-0ec5-42d0-966e-db1c2b787962"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936815 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-log-socket\") pod \"615b8d72-0ec5-42d0-966e-db1c2b787962\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") "
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936833 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwz2j\" (UniqueName: \"kubernetes.io/projected/615b8d72-0ec5-42d0-966e-db1c2b787962-kube-api-access-lwz2j\") pod \"615b8d72-0ec5-42d0-966e-db1c2b787962\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") "
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936864 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-run-systemd\") pod \"615b8d72-0ec5-42d0-966e-db1c2b787962\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") "
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936912 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-kubelet\") pod \"615b8d72-0ec5-42d0-966e-db1c2b787962\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") "
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936721 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "615b8d72-0ec5-42d0-966e-db1c2b787962" (UID: "615b8d72-0ec5-42d0-966e-db1c2b787962"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936930 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/615b8d72-0ec5-42d0-966e-db1c2b787962-ovn-node-metrics-cert\") pod \"615b8d72-0ec5-42d0-966e-db1c2b787962\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") "
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936961 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-run-ovn-kubernetes\") pod \"615b8d72-0ec5-42d0-966e-db1c2b787962\" (UID: \"615b8d72-0ec5-42d0-966e-db1c2b787962\") "
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.937097 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-host-run-ovn-kubernetes\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936833 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "615b8d72-0ec5-42d0-966e-db1c2b787962" (UID: "615b8d72-0ec5-42d0-966e-db1c2b787962"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936856 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-node-log" (OuterVolumeSpecName: "node-log") pod "615b8d72-0ec5-42d0-966e-db1c2b787962" (UID: "615b8d72-0ec5-42d0-966e-db1c2b787962"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.937123 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-run-openvswitch\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.937152 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-etc-openvswitch\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.937167 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-host-cni-netd\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.937186 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/685f728e-7857-4230-88b3-fcf23bcab630-env-overrides\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.937204 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-systemd-units\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936955 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-log-socket" (OuterVolumeSpecName: "log-socket") pod "615b8d72-0ec5-42d0-966e-db1c2b787962" (UID: "615b8d72-0ec5-42d0-966e-db1c2b787962"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.936980 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "615b8d72-0ec5-42d0-966e-db1c2b787962" (UID: "615b8d72-0ec5-42d0-966e-db1c2b787962"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.937048 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "615b8d72-0ec5-42d0-966e-db1c2b787962" (UID: "615b8d72-0ec5-42d0-966e-db1c2b787962"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.937094 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/615b8d72-0ec5-42d0-966e-db1c2b787962-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "615b8d72-0ec5-42d0-966e-db1c2b787962" (UID: "615b8d72-0ec5-42d0-966e-db1c2b787962"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.937234 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-host-run-netns\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.937205 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/615b8d72-0ec5-42d0-966e-db1c2b787962-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "615b8d72-0ec5-42d0-966e-db1c2b787962" (UID: "615b8d72-0ec5-42d0-966e-db1c2b787962"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.937303 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "615b8d72-0ec5-42d0-966e-db1c2b787962" (UID: "615b8d72-0ec5-42d0-966e-db1c2b787962"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.937566 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-run-systemd\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.937697 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-var-lib-openvswitch\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.937720 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/615b8d72-0ec5-42d0-966e-db1c2b787962-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "615b8d72-0ec5-42d0-966e-db1c2b787962" (UID: "615b8d72-0ec5-42d0-966e-db1c2b787962"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.937726 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/685f728e-7857-4230-88b3-fcf23bcab630-ovnkube-config\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.937801 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/685f728e-7857-4230-88b3-fcf23bcab630-ovnkube-script-lib\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.937828 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsffd\" (UniqueName: \"kubernetes.io/projected/685f728e-7857-4230-88b3-fcf23bcab630-kube-api-access-nsffd\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.937864 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-run-ovn\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.937931 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.938040 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/685f728e-7857-4230-88b3-fcf23bcab630-ovn-node-metrics-cert\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.938203 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-node-log\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.938229 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-host-cni-bin\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.938287 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-host-slash\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.938344 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-log-socket\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.938381 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-host-kubelet\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s"
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.938504 4681 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-cni-bin\") on node \"crc\" DevicePath \"\""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.938520 4681 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-run-openvswitch\") on node \"crc\" DevicePath \"\""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.938532 4681 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-cni-netd\") on node \"crc\" DevicePath \"\""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.938542 4681 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-systemd-units\") on node \"crc\" DevicePath \"\""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.938553 4681 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/615b8d72-0ec5-42d0-966e-db1c2b787962-ovnkube-config\") on node \"crc\" DevicePath \"\""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.938563 4681 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.938572 4681 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-node-log\") on node \"crc\" DevicePath \"\""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.938581 4681 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-log-socket\") on node \"crc\" DevicePath \"\""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.938594 4681 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-kubelet\") on node \"crc\" DevicePath \"\""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.938608 4681 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.938621 4681 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-run-netns\") on node \"crc\" DevicePath \"\""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.938630 4681 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.938642 4681 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-run-ovn\") on node \"crc\" DevicePath \"\""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.938655 4681 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/615b8d72-0ec5-42d0-966e-db1c2b787962-env-overrides\") on node \"crc\" DevicePath \"\""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.938668 4681 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.938680 4681 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-host-slash\") on node \"crc\" DevicePath \"\""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.938693 4681 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/615b8d72-0ec5-42d0-966e-db1c2b787962-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.944517 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/615b8d72-0ec5-42d0-966e-db1c2b787962-kube-api-access-lwz2j" (OuterVolumeSpecName: "kube-api-access-lwz2j") pod "615b8d72-0ec5-42d0-966e-db1c2b787962" (UID: "615b8d72-0ec5-42d0-966e-db1c2b787962"). InnerVolumeSpecName "kube-api-access-lwz2j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.944629 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/615b8d72-0ec5-42d0-966e-db1c2b787962-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "615b8d72-0ec5-42d0-966e-db1c2b787962" (UID: "615b8d72-0ec5-42d0-966e-db1c2b787962"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 17:14:38 crc kubenswrapper[4681]: I1007 17:14:38.951866 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "615b8d72-0ec5-42d0-966e-db1c2b787962" (UID: "615b8d72-0ec5-42d0-966e-db1c2b787962"). InnerVolumeSpecName "run-systemd".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.039673 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.039715 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/685f728e-7857-4230-88b3-fcf23bcab630-ovn-node-metrics-cert\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.039740 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-node-log\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.039754 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-host-cni-bin\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.039771 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-host-slash\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.039791 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-log-socket\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.039807 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-host-kubelet\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.039822 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-host-run-ovn-kubernetes\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.039840 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-run-openvswitch\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.039860 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-etc-openvswitch\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.039874 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-host-cni-netd\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.039916 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/685f728e-7857-4230-88b3-fcf23bcab630-env-overrides\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.039932 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-systemd-units\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.039954 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-host-run-netns\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.039974 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-run-systemd\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.039993 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-var-lib-openvswitch\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.040007 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/685f728e-7857-4230-88b3-fcf23bcab630-ovnkube-config\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.040022 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/685f728e-7857-4230-88b3-fcf23bcab630-ovnkube-script-lib\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: 
I1007 17:14:39.040040 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsffd\" (UniqueName: \"kubernetes.io/projected/685f728e-7857-4230-88b3-fcf23bcab630-kube-api-access-nsffd\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.040065 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-run-ovn\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.040125 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwz2j\" (UniqueName: \"kubernetes.io/projected/615b8d72-0ec5-42d0-966e-db1c2b787962-kube-api-access-lwz2j\") on node \"crc\" DevicePath \"\"" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.040137 4681 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/615b8d72-0ec5-42d0-966e-db1c2b787962-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.040146 4681 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/615b8d72-0ec5-42d0-966e-db1c2b787962-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.040186 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-run-ovn\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.040218 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.040672 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-host-cni-netd\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.040723 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-node-log\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.040757 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-host-cni-bin\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.040805 4681 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-host-slash\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.040825 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-log-socket\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.040844 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-host-kubelet\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.040864 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-host-run-ovn-kubernetes\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.040906 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-run-openvswitch\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.040932 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-etc-openvswitch\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.040953 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-run-systemd\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.041392 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/685f728e-7857-4230-88b3-fcf23bcab630-env-overrides\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.041433 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-systemd-units\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.041453 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-host-run-netns\") pod 
\"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.041828 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/685f728e-7857-4230-88b3-fcf23bcab630-ovnkube-config\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.041871 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/685f728e-7857-4230-88b3-fcf23bcab630-var-lib-openvswitch\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.042322 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/685f728e-7857-4230-88b3-fcf23bcab630-ovnkube-script-lib\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.045367 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/685f728e-7857-4230-88b3-fcf23bcab630-ovn-node-metrics-cert\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.067014 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsffd\" (UniqueName: \"kubernetes.io/projected/685f728e-7857-4230-88b3-fcf23bcab630-kube-api-access-nsffd\") pod \"ovnkube-node-lrg2s\" (UID: \"685f728e-7857-4230-88b3-fcf23bcab630\") " pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.177010 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.189474 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-5qxkp" event={"ID":"e132d85f-c498-41eb-a780-be92455331bb","Type":"ContainerStarted","Data":"bbcf7624c3fdd98acd9f307ebe1a7660b4327aa94e3cb1d8adbda65f4617d920"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.191129 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bt6z6_78a1d2b3-3c0e-49f1-877c-db4f34d3154b/kube-multus/2.log" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.191709 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bt6z6_78a1d2b3-3c0e-49f1-877c-db4f34d3154b/kube-multus/1.log" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.191832 4681 generic.go:334] "Generic (PLEG): container finished" podID="78a1d2b3-3c0e-49f1-877c-db4f34d3154b" containerID="91b648fdfcd673307e0e2e274754851911d53861f02308144f0874b59804ea09" exitCode=2 Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.191955 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bt6z6" event={"ID":"78a1d2b3-3c0e-49f1-877c-db4f34d3154b","Type":"ContainerDied","Data":"91b648fdfcd673307e0e2e274754851911d53861f02308144f0874b59804ea09"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.192014 4681 scope.go:117] "RemoveContainer" containerID="bc452c09c8f7b7c7c78ba1ca48d06b861e7f647975cf88452a4426686d360817" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.192635 4681 scope.go:117] "RemoveContainer" containerID="91b648fdfcd673307e0e2e274754851911d53861f02308144f0874b59804ea09" Oct 07 17:14:39 crc kubenswrapper[4681]: E1007 17:14:39.192797 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-bt6z6_openshift-multus(78a1d2b3-3c0e-49f1-877c-db4f34d3154b)\"" pod="openshift-multus/multus-bt6z6" podUID="78a1d2b3-3c0e-49f1-877c-db4f34d3154b" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.199505 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6lkl_615b8d72-0ec5-42d0-966e-db1c2b787962/ovnkube-controller/3.log" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.201533 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6lkl_615b8d72-0ec5-42d0-966e-db1c2b787962/ovn-acl-logging/0.log" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.201982 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6lkl_615b8d72-0ec5-42d0-966e-db1c2b787962/ovn-controller/0.log" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202325 4681 generic.go:334] "Generic (PLEG): container finished" podID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerID="3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a" exitCode=0 Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202353 4681 generic.go:334] "Generic (PLEG): container finished" podID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerID="8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353" exitCode=0 Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202364 4681 generic.go:334] "Generic (PLEG): container finished" podID="615b8d72-0ec5-42d0-966e-db1c2b787962" 
containerID="ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b" exitCode=0 Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202374 4681 generic.go:334] "Generic (PLEG): container finished" podID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerID="fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0" exitCode=0 Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202381 4681 generic.go:334] "Generic (PLEG): container finished" podID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerID="ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0" exitCode=0 Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202390 4681 generic.go:334] "Generic (PLEG): container finished" podID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerID="1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa" exitCode=0 Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202398 4681 generic.go:334] "Generic (PLEG): container finished" podID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerID="7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a" exitCode=143 Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202408 4681 generic.go:334] "Generic (PLEG): container finished" podID="615b8d72-0ec5-42d0-966e-db1c2b787962" containerID="42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139" exitCode=143 Oct 07 17:14:39 crc kubenswrapper[4681]: W1007 17:14:39.202335 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod685f728e_7857_4230_88b3_fcf23bcab630.slice/crio-19567ac44749adbe64bf9da9dfcd1da3e66b6268299692544fc6471cbdcde36a WatchSource:0}: Error finding container 19567ac44749adbe64bf9da9dfcd1da3e66b6268299692544fc6471cbdcde36a: Status 404 returned error can't find the container with id 19567ac44749adbe64bf9da9dfcd1da3e66b6268299692544fc6471cbdcde36a Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202433 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202438 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerDied","Data":"3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202469 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerDied","Data":"8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202484 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerDied","Data":"ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202498 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerDied","Data":"fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202511 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerDied","Data":"ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202524 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerDied","Data":"1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202537 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202549 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202556 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202562 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202568 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202573 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202579 4681 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202585 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202591 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202598 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202607 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerDied","Data":"7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202619 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202627 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202634 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202639 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202644 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202650 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202655 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202660 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202666 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202671 4681 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202678 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerDied","Data":"42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202687 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202709 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202714 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202719 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202726 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202731 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202736 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202741 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202746 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202751 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202757 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6lkl" event={"ID":"615b8d72-0ec5-42d0-966e-db1c2b787962","Type":"ContainerDied","Data":"b720ea623dfc5e1a465a899aeb2994c7b62aeaa0357c29f74f906e2e42f9f10e"} Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202765 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a"} 
Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202771 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5"}
Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202776 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353"}
Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202781 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b"}
Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202786 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0"}
Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202792 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0"}
Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202797 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa"}
Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202802 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a"}
Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202807 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139"}
Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.202812 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0"}
Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.214856 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-5qxkp" podStartSLOduration=2.436837326 podStartE2EDuration="17.214838466s" podCreationTimestamp="2025-10-07 17:14:22 +0000 UTC" firstStartedPulling="2025-10-07 17:14:23.240192704 +0000 UTC m=+666.887604259" lastFinishedPulling="2025-10-07 17:14:38.018193844 +0000 UTC m=+681.665605399" observedRunningTime="2025-10-07 17:14:39.214342162 +0000 UTC m=+682.861753717" watchObservedRunningTime="2025-10-07 17:14:39.214838466 +0000 UTC m=+682.862250011"
Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.271915 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-d6lkl"]
Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.275015 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-d6lkl"]
Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.958268 4681 scope.go:117] "RemoveContainer" containerID="3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a"
Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.972048 4681 scope.go:117] "RemoveContainer" containerID="c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5"
Oct 07 17:14:39 crc kubenswrapper[4681]: I1007 17:14:39.990118 4681 scope.go:117] "RemoveContainer" containerID="8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.002551 4681 scope.go:117] "RemoveContainer" containerID="ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.015129 4681 scope.go:117] "RemoveContainer" containerID="fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.025168 4681 scope.go:117] "RemoveContainer" containerID="ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.034706 4681 scope.go:117] "RemoveContainer" containerID="1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.048196 4681 scope.go:117] "RemoveContainer" containerID="7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.065353 4681 scope.go:117] "RemoveContainer" containerID="42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.113584 4681 scope.go:117] "RemoveContainer" containerID="b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.133572 4681 scope.go:117] "RemoveContainer" containerID="3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a"
Oct 07 17:14:40 crc kubenswrapper[4681]: E1007 17:14:40.133959 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a\": container with ID starting with 3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a not found: ID does not exist" containerID="3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.133995 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a"} err="failed to get container status \"3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a\": rpc error: code = NotFound desc = could not find container \"3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a\": container with ID starting with 3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a not found: ID does not exist"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.134016 4681 scope.go:117] "RemoveContainer" containerID="c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5"
Oct 07 17:14:40 crc kubenswrapper[4681]: E1007 17:14:40.134272 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5\": container with ID starting with c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5 not found: ID does not exist" containerID="c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.134297 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5"} err="failed to get container status \"c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5\": rpc error: code = NotFound desc = could not find container \"c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5\": container with ID starting with c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5 not found: ID does not exist"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.134319 4681 scope.go:117] "RemoveContainer" containerID="8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353"
Oct 07 17:14:40 crc kubenswrapper[4681]: E1007 17:14:40.134602 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\": container with ID starting with 8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353 not found: ID does not exist" containerID="8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.134624 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353"} err="failed to get container status \"8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\": rpc error: code = NotFound desc = could not find container \"8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\": container with ID starting with 8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353 not found: ID does not exist"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.134638 4681 scope.go:117] "RemoveContainer" containerID="ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b"
Oct 07 17:14:40 crc kubenswrapper[4681]: E1007 17:14:40.134911 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\": container with ID starting with ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b not found: ID does not exist" containerID="ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.134945 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b"} err="failed to get container status \"ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\": rpc error: code = NotFound desc = could not find container \"ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\": container with ID starting with ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b not found: ID does not exist"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.134961 4681 scope.go:117] "RemoveContainer" containerID="fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0"
Oct 07 17:14:40 crc kubenswrapper[4681]: E1007 17:14:40.135255 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\": container with ID starting with fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0 not found: ID does not exist" containerID="fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.135281 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0"} err="failed to get container status \"fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\": rpc error: code = NotFound desc = could not find container \"fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\": container with ID starting with fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0 not found: ID does not exist"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.135300 4681 scope.go:117] "RemoveContainer" containerID="ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0"
Oct 07 17:14:40 crc kubenswrapper[4681]: E1007 17:14:40.135597 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\": container with ID starting with ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0 not found: ID does not exist" containerID="ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.135624 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0"} err="failed to get container status \"ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\": rpc error: code = NotFound desc = could not find container \"ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\": container with ID starting with ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0 not found: ID does not exist"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.135642 4681 scope.go:117] "RemoveContainer" containerID="1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa"
Oct 07 17:14:40 crc kubenswrapper[4681]: E1007 17:14:40.135992 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\": container with ID starting with 1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa not found: ID does not exist" containerID="1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.136017 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa"} err="failed to get container status \"1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\": rpc error: code = NotFound desc = could not find container \"1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\": container with ID starting with 1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa not found: ID does not exist"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.136033 4681 scope.go:117] "RemoveContainer" containerID="7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a"
Oct 07 17:14:40 crc kubenswrapper[4681]: E1007 17:14:40.136317 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\": container with ID starting with 7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a not found: ID does not exist" containerID="7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.136344 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a"} err="failed to get container status \"7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\": rpc error: code = NotFound desc = could not find container \"7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\": container with ID starting with 7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a not found: ID does not exist"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.136360 4681 scope.go:117] "RemoveContainer" containerID="42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139"
Oct 07 17:14:40 crc kubenswrapper[4681]: E1007 17:14:40.136571 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\": container with ID starting with 42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139 not found: ID does not exist" containerID="42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.136594 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139"} err="failed to get container status \"42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\": rpc error: code = NotFound desc = could not find container \"42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\": container with ID starting with 42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139 not found: ID does not exist"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.136610 4681 scope.go:117] "RemoveContainer" containerID="b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0"
Oct 07 17:14:40 crc kubenswrapper[4681]: E1007 17:14:40.137006 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\": container with ID starting with b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0 not found: ID does not exist" containerID="b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.137038 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0"} err="failed to get container status \"b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\": rpc error: code = NotFound desc = could not find container \"b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\": container with ID starting with b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0 not found: ID does not exist"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.137056 4681 scope.go:117] "RemoveContainer" containerID="3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.137295 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a"} err="failed to get container status \"3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a\": rpc error: code = NotFound desc = could not find container \"3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a\": container with ID starting with 3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a not found: ID does not exist"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.137320 4681 scope.go:117] "RemoveContainer" containerID="c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.137527 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5"} err="failed to get container status \"c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5\": rpc error: code = NotFound desc = could not find container \"c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5\": container with ID starting with c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5 not found: ID does not exist"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.137547 4681 scope.go:117] "RemoveContainer" containerID="8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.137791 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353"} err="failed to get container status \"8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\": rpc error: code = NotFound desc = could not find container \"8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\": container with ID starting with 8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353 not found: ID does not exist"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.137814 4681 scope.go:117] "RemoveContainer" containerID="ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.138063 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b"} err="failed to get container status \"ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\": rpc error: code = NotFound desc = could not find container \"ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\": container with ID starting with ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b not found: ID does not exist"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.138080 4681 scope.go:117] "RemoveContainer" containerID="fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.138325 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0"} err="failed to get container status \"fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\": rpc error: code = NotFound desc = could not find container \"fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\": container with ID starting with fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0 not found: ID does not exist"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.138347 4681 scope.go:117] "RemoveContainer" containerID="ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.138530 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0"} err="failed to get container status \"ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\": rpc error: code = NotFound desc = could not find container \"ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\": container with ID starting with ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0 not found: ID does not exist"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.138557 4681 scope.go:117] "RemoveContainer" containerID="1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.138872 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa"} err="failed to get container status \"1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\": rpc error: code = NotFound desc = could not find container \"1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\": container with ID starting with 1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa not found: ID does not exist"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.138957 4681 scope.go:117] "RemoveContainer" containerID="7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.139254 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a"} err="failed to get container status \"7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\": rpc error: code = NotFound desc = could not find container \"7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\": container with ID starting with 7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a not found: ID does not exist"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.139279 4681 scope.go:117] "RemoveContainer" containerID="42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.139532 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139"} err="failed to get container status \"42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\": rpc error: code = NotFound desc = could not find container \"42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\": container with ID starting with 42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139 not found: ID does not exist"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.139559 4681 scope.go:117] "RemoveContainer" containerID="b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0"
Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.139788 4681 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0"} err="failed to get container status \"b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\": rpc error: code = NotFound desc = could not find container \"b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\": container with ID starting with b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0 not found: ID does not exist" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.139811 4681 scope.go:117] "RemoveContainer" containerID="3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.140068 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a"} err="failed to get container status \"3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a\": rpc error: code = NotFound desc = could not find container \"3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a\": container with ID starting with 3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a not found: ID does not exist" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.140103 4681 scope.go:117] "RemoveContainer" containerID="c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.140374 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5"} err="failed to get container status \"c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5\": rpc error: code = NotFound desc = could not find container \"c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5\": container with ID starting with c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5 not found: ID does not exist" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.140395 4681 scope.go:117] "RemoveContainer" containerID="8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.141037 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353"} err="failed to get container status \"8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\": rpc error: code = NotFound desc = could not find container \"8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\": container with ID starting with 8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353 not found: ID does not exist" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.141088 4681 scope.go:117] "RemoveContainer" containerID="ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.141394 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b"} err="failed to get container status \"ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\": rpc error: code = NotFound desc = could not find container \"ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\": container with ID starting with ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b not found: ID does not exist" Oct 
07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.141421 4681 scope.go:117] "RemoveContainer" containerID="fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.141672 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0"} err="failed to get container status \"fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\": rpc error: code = NotFound desc = could not find container \"fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\": container with ID starting with fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0 not found: ID does not exist" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.141700 4681 scope.go:117] "RemoveContainer" containerID="ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.141985 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0"} err="failed to get container status \"ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\": rpc error: code = NotFound desc = could not find container \"ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\": container with ID starting with ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0 not found: ID does not exist" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.142006 4681 scope.go:117] "RemoveContainer" containerID="1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.142497 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa"} err="failed to get container status \"1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\": rpc error: code = NotFound desc = could not find container \"1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\": container with ID starting with 1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa not found: ID does not exist" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.142524 4681 scope.go:117] "RemoveContainer" containerID="7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.142831 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a"} err="failed to get container status \"7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\": rpc error: code = NotFound desc = could not find container \"7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\": container with ID starting with 7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a not found: ID does not exist" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.142857 4681 scope.go:117] "RemoveContainer" containerID="42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.143264 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139"} err="failed to get container status 
\"42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\": rpc error: code = NotFound desc = could not find container \"42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\": container with ID starting with 42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139 not found: ID does not exist" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.143289 4681 scope.go:117] "RemoveContainer" containerID="b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.143596 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0"} err="failed to get container status \"b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\": rpc error: code = NotFound desc = could not find container \"b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\": container with ID starting with b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0 not found: ID does not exist" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.143652 4681 scope.go:117] "RemoveContainer" containerID="3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.143912 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a"} err="failed to get container status \"3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a\": rpc error: code = NotFound desc = could not find container \"3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a\": container with ID starting with 3d1ac1c14932dd447b74e0319037727370a26f23cb4971e9821c844e5d3f997a not found: ID does not exist" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.143933 4681 scope.go:117] "RemoveContainer" containerID="c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.144226 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5"} err="failed to get container status \"c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5\": rpc error: code = NotFound desc = could not find container \"c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5\": container with ID starting with c0748b3f1e13507c2e70e5f09b7c00e07880439478c2153c726b8a59312e25a5 not found: ID does not exist" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.144253 4681 scope.go:117] "RemoveContainer" containerID="8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.144537 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353"} err="failed to get container status \"8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\": rpc error: code = NotFound desc = could not find container \"8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353\": container with ID starting with 8dce89da4191da810008d35418b5a901c8f21954b6e9e02dbb56230da7b20353 not found: ID does not exist" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.144567 4681 scope.go:117] "RemoveContainer" 
containerID="ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.144851 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b"} err="failed to get container status \"ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\": rpc error: code = NotFound desc = could not find container \"ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b\": container with ID starting with ddb2241003c963fe1333297ec450580426fbc99ff1aa27864499b8b1e0ca628b not found: ID does not exist" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.144873 4681 scope.go:117] "RemoveContainer" containerID="fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.145212 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0"} err="failed to get container status \"fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\": rpc error: code = NotFound desc = could not find container \"fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0\": container with ID starting with fcae6f4fd84f3eca6b3d30ef51f6c20a5c4b6ec8af16694152209001fa20e1f0 not found: ID does not exist" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.145239 4681 scope.go:117] "RemoveContainer" containerID="ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.145535 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0"} err="failed to get container status \"ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\": rpc error: code = NotFound desc = could not find container \"ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0\": container with ID starting with ae6b3d44dfca4a0c8f5a7a5af550270e27c454fa95ad96f72d02b06e01193ea0 not found: ID does not exist" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.145555 4681 scope.go:117] "RemoveContainer" containerID="1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.145907 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa"} err="failed to get container status \"1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\": rpc error: code = NotFound desc = could not find container \"1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa\": container with ID starting with 1e75093e6da1e8824eee18d8a8facbd15768297f41ca4b2eb1686311265681fa not found: ID does not exist" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.145930 4681 scope.go:117] "RemoveContainer" containerID="7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.146283 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a"} err="failed to get container status \"7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\": rpc error: code = NotFound desc = could not find 
container \"7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a\": container with ID starting with 7b6ef8d9032211805382eea22c1465e026245ddf2104dcf4a172f09f0fea5d3a not found: ID does not exist" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.146308 4681 scope.go:117] "RemoveContainer" containerID="42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.146544 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139"} err="failed to get container status \"42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\": rpc error: code = NotFound desc = could not find container \"42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139\": container with ID starting with 42cc130cb69a39da10df4dcc08a1562b2ff7de0886a2c592ca74e8ff6cf02139 not found: ID does not exist" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.146576 4681 scope.go:117] "RemoveContainer" containerID="b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.146895 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0"} err="failed to get container status \"b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\": rpc error: code = NotFound desc = could not find container \"b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0\": container with ID starting with b92c3a44c715264a03a02691d8f0a846ce708814edd02a081d648c202ebd55c0 not found: ID does not exist" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.208341 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-djw5h" event={"ID":"a8a15de5-2d99-41a6-b4c9-7d31c28413b2","Type":"ContainerStarted","Data":"c289641443744133916402a3d74fe55e67952a2f9c3f68dac51cbc64020f4975"} Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.209080 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-djw5h" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.210447 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bt6z6_78a1d2b3-3c0e-49f1-877c-db4f34d3154b/kube-multus/2.log" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.211650 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-wvq66" event={"ID":"b603cb8d-41a5-4537-95da-d2e4fa39ce75","Type":"ContainerStarted","Data":"069e15c5e8035a75e461722c20d2cb3d4c5f58002178f608bd478f00ef03a362"} Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.213097 4681 generic.go:334] "Generic (PLEG): container finished" podID="685f728e-7857-4230-88b3-fcf23bcab630" containerID="a8b20f1d6a4b36c90e67173fb17f3ff245fd1269f2875f3c4d55f9b5995215de" exitCode=0 Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.213199 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" event={"ID":"685f728e-7857-4230-88b3-fcf23bcab630","Type":"ContainerDied","Data":"a8b20f1d6a4b36c90e67173fb17f3ff245fd1269f2875f3c4d55f9b5995215de"} Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.213357 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" 
event={"ID":"685f728e-7857-4230-88b3-fcf23bcab630","Type":"ContainerStarted","Data":"19567ac44749adbe64bf9da9dfcd1da3e66b6268299692544fc6471cbdcde36a"} Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.224067 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-djw5h" podStartSLOduration=2.463652143 podStartE2EDuration="18.224053606s" podCreationTimestamp="2025-10-07 17:14:22 +0000 UTC" firstStartedPulling="2025-10-07 17:14:23.00653248 +0000 UTC m=+666.653944035" lastFinishedPulling="2025-10-07 17:14:38.766933943 +0000 UTC m=+682.414345498" observedRunningTime="2025-10-07 17:14:40.22243746 +0000 UTC m=+683.869849035" watchObservedRunningTime="2025-10-07 17:14:40.224053606 +0000 UTC m=+683.871465161" Oct 07 17:14:40 crc kubenswrapper[4681]: I1007 17:14:40.280269 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-wvq66" podStartSLOduration=2.482501213 podStartE2EDuration="18.280250099s" podCreationTimestamp="2025-10-07 17:14:22 +0000 UTC" firstStartedPulling="2025-10-07 17:14:22.959271168 +0000 UTC m=+666.606682723" lastFinishedPulling="2025-10-07 17:14:38.757020054 +0000 UTC m=+682.404431609" observedRunningTime="2025-10-07 17:14:40.278858431 +0000 UTC m=+683.926269986" watchObservedRunningTime="2025-10-07 17:14:40.280250099 +0000 UTC m=+683.927661654" Oct 07 17:14:41 crc kubenswrapper[4681]: I1007 17:14:41.035938 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="615b8d72-0ec5-42d0-966e-db1c2b787962" path="/var/lib/kubelet/pods/615b8d72-0ec5-42d0-966e-db1c2b787962/volumes" Oct 07 17:14:41 crc kubenswrapper[4681]: I1007 17:14:41.220671 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" event={"ID":"685f728e-7857-4230-88b3-fcf23bcab630","Type":"ContainerStarted","Data":"5334211ca3cf1c702fd1c6916654900d8476198caae9a47fb7184262c7b6fd57"} Oct 07 17:14:41 crc kubenswrapper[4681]: I1007 17:14:41.220725 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" event={"ID":"685f728e-7857-4230-88b3-fcf23bcab630","Type":"ContainerStarted","Data":"a60ace68c10620419124ba2f193896412aceba459deb9bc4ac32394262b6e840"} Oct 07 17:14:41 crc kubenswrapper[4681]: I1007 17:14:41.220735 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" event={"ID":"685f728e-7857-4230-88b3-fcf23bcab630","Type":"ContainerStarted","Data":"ca42e217a33ff553ebf91e5a31f7ca81f7ecb754c0bf1603462ec78014ea1051"} Oct 07 17:14:42 crc kubenswrapper[4681]: I1007 17:14:42.194945 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 17:14:42 crc kubenswrapper[4681]: I1007 17:14:42.195338 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 17:14:42 crc kubenswrapper[4681]: I1007 17:14:42.228145 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" 
event={"ID":"685f728e-7857-4230-88b3-fcf23bcab630","Type":"ContainerStarted","Data":"a52235d11492b3ae03654c076606bc74fe664542e0a92f4e99676bf406ae00c7"} Oct 07 17:14:42 crc kubenswrapper[4681]: I1007 17:14:42.228190 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" event={"ID":"685f728e-7857-4230-88b3-fcf23bcab630","Type":"ContainerStarted","Data":"f6e12dbc5e6deee19b24263d70cd534c8a7f97df9e23516abbbddafc5dd1b540"} Oct 07 17:14:43 crc kubenswrapper[4681]: I1007 17:14:43.235467 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" event={"ID":"685f728e-7857-4230-88b3-fcf23bcab630","Type":"ContainerStarted","Data":"2d9e0769f0f1e20e0dc406f91bb713a4ef040dabe2275e08f419c0af7070fad1"} Oct 07 17:14:46 crc kubenswrapper[4681]: I1007 17:14:46.254954 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" event={"ID":"685f728e-7857-4230-88b3-fcf23bcab630","Type":"ContainerStarted","Data":"997d2db08e40df0bd4c5e605b058ee542b40db1f47ec9e92a527cd97e8b1ddd0"} Oct 07 17:14:47 crc kubenswrapper[4681]: I1007 17:14:47.709303 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-djw5h" Oct 07 17:14:48 crc kubenswrapper[4681]: I1007 17:14:48.270441 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" event={"ID":"685f728e-7857-4230-88b3-fcf23bcab630","Type":"ContainerStarted","Data":"d550781db0ff14e5c43e87c37ce56fe1d0d3e0c3a5470e9d7d66b6828b34da9d"} Oct 07 17:14:48 crc kubenswrapper[4681]: I1007 17:14:48.270764 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:48 crc kubenswrapper[4681]: I1007 17:14:48.270780 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:48 crc kubenswrapper[4681]: I1007 17:14:48.299342 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" podStartSLOduration=10.299327966 podStartE2EDuration="10.299327966s" podCreationTimestamp="2025-10-07 17:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:14:48.297450412 +0000 UTC m=+691.944861967" watchObservedRunningTime="2025-10-07 17:14:48.299327966 +0000 UTC m=+691.946739521" Oct 07 17:14:48 crc kubenswrapper[4681]: I1007 17:14:48.302480 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:49 crc kubenswrapper[4681]: I1007 17:14:49.177910 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:49 crc kubenswrapper[4681]: I1007 17:14:49.204179 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:14:50 crc kubenswrapper[4681]: I1007 17:14:50.029353 4681 scope.go:117] "RemoveContainer" containerID="91b648fdfcd673307e0e2e274754851911d53861f02308144f0874b59804ea09" Oct 07 17:14:50 crc kubenswrapper[4681]: E1007 17:14:50.029814 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus 
Oct 07 17:15:00 crc kubenswrapper[4681]: I1007 17:15:00.132532 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd"] Oct 07 17:15:00 crc kubenswrapper[4681]: I1007 17:15:00.133806 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd" Oct 07 17:15:00 crc kubenswrapper[4681]: I1007 17:15:00.135649 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 17:15:00 crc kubenswrapper[4681]: I1007 17:15:00.137054 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 17:15:00 crc kubenswrapper[4681]: I1007 17:15:00.156389 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd"] Oct 07 17:15:00 crc kubenswrapper[4681]: I1007 17:15:00.258591 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fl7b\" (UniqueName: \"kubernetes.io/projected/d9aa3d7c-f712-4749-a1d8-a9688c1c3d23-kube-api-access-7fl7b\") pod \"collect-profiles-29330955-rkdgd\" (UID: \"d9aa3d7c-f712-4749-a1d8-a9688c1c3d23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd" Oct 07 17:15:00 crc kubenswrapper[4681]: I1007 17:15:00.258651 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9aa3d7c-f712-4749-a1d8-a9688c1c3d23-secret-volume\") pod \"collect-profiles-29330955-rkdgd\" (UID: \"d9aa3d7c-f712-4749-a1d8-a9688c1c3d23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd" Oct 07 17:15:00 crc kubenswrapper[4681]: I1007 17:15:00.258840 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9aa3d7c-f712-4749-a1d8-a9688c1c3d23-config-volume\") pod \"collect-profiles-29330955-rkdgd\" (UID: \"d9aa3d7c-f712-4749-a1d8-a9688c1c3d23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd" Oct 07 17:15:00 crc kubenswrapper[4681]: I1007 17:15:00.360032 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9aa3d7c-f712-4749-a1d8-a9688c1c3d23-config-volume\") pod \"collect-profiles-29330955-rkdgd\" (UID: \"d9aa3d7c-f712-4749-a1d8-a9688c1c3d23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd" Oct 07 17:15:00 crc kubenswrapper[4681]: I1007 17:15:00.360135 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fl7b\" (UniqueName: \"kubernetes.io/projected/d9aa3d7c-f712-4749-a1d8-a9688c1c3d23-kube-api-access-7fl7b\") pod \"collect-profiles-29330955-rkdgd\" (UID: \"d9aa3d7c-f712-4749-a1d8-a9688c1c3d23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd" Oct 07 17:15:00 crc kubenswrapper[4681]: I1007 17:15:00.360190 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d9aa3d7c-f712-4749-a1d8-a9688c1c3d23-secret-volume\") pod \"collect-profiles-29330955-rkdgd\" (UID: \"d9aa3d7c-f712-4749-a1d8-a9688c1c3d23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd" Oct 07 17:15:00 crc kubenswrapper[4681]: I1007 17:15:00.361290 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9aa3d7c-f712-4749-a1d8-a9688c1c3d23-config-volume\") pod \"collect-profiles-29330955-rkdgd\" (UID: \"d9aa3d7c-f712-4749-a1d8-a9688c1c3d23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd" Oct 07 17:15:00 crc kubenswrapper[4681]: I1007 17:15:00.376048 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9aa3d7c-f712-4749-a1d8-a9688c1c3d23-secret-volume\") pod \"collect-profiles-29330955-rkdgd\" (UID: \"d9aa3d7c-f712-4749-a1d8-a9688c1c3d23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd" Oct 07 17:15:00 crc kubenswrapper[4681]: I1007 17:15:00.376468 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fl7b\" (UniqueName: \"kubernetes.io/projected/d9aa3d7c-f712-4749-a1d8-a9688c1c3d23-kube-api-access-7fl7b\") pod \"collect-profiles-29330955-rkdgd\" (UID: \"d9aa3d7c-f712-4749-a1d8-a9688c1c3d23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd" Oct 07 17:15:00 crc kubenswrapper[4681]: I1007 17:15:00.458398 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd" Oct 07 17:15:00 crc kubenswrapper[4681]: E1007 17:15:00.482472 4681 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29330955-rkdgd_openshift-operator-lifecycle-manager_d9aa3d7c-f712-4749-a1d8-a9688c1c3d23_0(3813dfcd599c493a59e63defbb6440d8f313a9c74b44ea5981b07515c83825a7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 07 17:15:00 crc kubenswrapper[4681]: E1007 17:15:00.482587 4681 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29330955-rkdgd_openshift-operator-lifecycle-manager_d9aa3d7c-f712-4749-a1d8-a9688c1c3d23_0(3813dfcd599c493a59e63defbb6440d8f313a9c74b44ea5981b07515c83825a7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd" Oct 07 17:15:00 crc kubenswrapper[4681]: E1007 17:15:00.482827 4681 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29330955-rkdgd_openshift-operator-lifecycle-manager_d9aa3d7c-f712-4749-a1d8-a9688c1c3d23_0(3813dfcd599c493a59e63defbb6440d8f313a9c74b44ea5981b07515c83825a7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd" Oct 07 17:15:00 crc kubenswrapper[4681]: E1007 17:15:00.482964 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29330955-rkdgd_openshift-operator-lifecycle-manager(d9aa3d7c-f712-4749-a1d8-a9688c1c3d23)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29330955-rkdgd_openshift-operator-lifecycle-manager(d9aa3d7c-f712-4749-a1d8-a9688c1c3d23)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29330955-rkdgd_openshift-operator-lifecycle-manager_d9aa3d7c-f712-4749-a1d8-a9688c1c3d23_0(3813dfcd599c493a59e63defbb6440d8f313a9c74b44ea5981b07515c83825a7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd" podUID="d9aa3d7c-f712-4749-a1d8-a9688c1c3d23" Oct 07 17:15:01 crc kubenswrapper[4681]: I1007 17:15:01.344246 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd" Oct 07 17:15:01 crc kubenswrapper[4681]: I1007 17:15:01.345192 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd" Oct 07 17:15:01 crc kubenswrapper[4681]: E1007 17:15:01.369982 4681 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29330955-rkdgd_openshift-operator-lifecycle-manager_d9aa3d7c-f712-4749-a1d8-a9688c1c3d23_0(099f04e121eb98a7319409aa5f21e3c90f6641bbee08ceead8b981386c6325a8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 07 17:15:01 crc kubenswrapper[4681]: E1007 17:15:01.370163 4681 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29330955-rkdgd_openshift-operator-lifecycle-manager_d9aa3d7c-f712-4749-a1d8-a9688c1c3d23_0(099f04e121eb98a7319409aa5f21e3c90f6641bbee08ceead8b981386c6325a8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd" Oct 07 17:15:01 crc kubenswrapper[4681]: E1007 17:15:01.370268 4681 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29330955-rkdgd_openshift-operator-lifecycle-manager_d9aa3d7c-f712-4749-a1d8-a9688c1c3d23_0(099f04e121eb98a7319409aa5f21e3c90f6641bbee08ceead8b981386c6325a8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd" Oct 07 17:15:01 crc kubenswrapper[4681]: E1007 17:15:01.370397 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29330955-rkdgd_openshift-operator-lifecycle-manager(d9aa3d7c-f712-4749-a1d8-a9688c1c3d23)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29330955-rkdgd_openshift-operator-lifecycle-manager(d9aa3d7c-f712-4749-a1d8-a9688c1c3d23)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29330955-rkdgd_openshift-operator-lifecycle-manager_d9aa3d7c-f712-4749-a1d8-a9688c1c3d23_0(099f04e121eb98a7319409aa5f21e3c90f6641bbee08ceead8b981386c6325a8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd" podUID="d9aa3d7c-f712-4749-a1d8-a9688c1c3d23" Oct 07 17:15:05 crc kubenswrapper[4681]: I1007 17:15:05.029214 4681 scope.go:117] "RemoveContainer" containerID="91b648fdfcd673307e0e2e274754851911d53861f02308144f0874b59804ea09" Oct 07 17:15:05 crc kubenswrapper[4681]: I1007 17:15:05.371816 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bt6z6_78a1d2b3-3c0e-49f1-877c-db4f34d3154b/kube-multus/2.log" Oct 07 17:15:05 crc kubenswrapper[4681]: I1007 17:15:05.372181 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bt6z6" event={"ID":"78a1d2b3-3c0e-49f1-877c-db4f34d3154b","Type":"ContainerStarted","Data":"d2c414be64b0401c18defecac7b9e2d402b0ef7d822377d918a7616a026ddedf"} Oct 07 17:15:09 crc kubenswrapper[4681]: I1007 17:15:09.197318 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lrg2s" Oct 07 17:15:12 crc kubenswrapper[4681]: I1007 17:15:12.195612 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 17:15:12 crc kubenswrapper[4681]: I1007 17:15:12.195674 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 17:15:15 crc kubenswrapper[4681]: I1007 17:15:15.028600 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd" Oct 07 17:15:15 crc kubenswrapper[4681]: I1007 17:15:15.029285 4681 util.go:30] "No sandbox for pod can be found. 
Oct 07 17:15:15 crc kubenswrapper[4681]: I1007 17:15:15.427984 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd"] Oct 07 17:15:15 crc kubenswrapper[4681]: W1007 17:15:15.438016 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9aa3d7c_f712_4749_a1d8_a9688c1c3d23.slice/crio-7b4ba2314dd57998d35308409e16ff2380a28b583e3945b5e843a3ca4fd7835b WatchSource:0}: Error finding container 7b4ba2314dd57998d35308409e16ff2380a28b583e3945b5e843a3ca4fd7835b: Status 404 returned error can't find the container with id 7b4ba2314dd57998d35308409e16ff2380a28b583e3945b5e843a3ca4fd7835b Oct 07 17:15:16 crc kubenswrapper[4681]: I1007 17:15:16.429054 4681 generic.go:334] "Generic (PLEG): container finished" podID="d9aa3d7c-f712-4749-a1d8-a9688c1c3d23" containerID="f32f915c969fabc27d2e0413dd11d6d252033af1742c668dd39a3b0a31d59730" exitCode=0 Oct 07 17:15:16 crc kubenswrapper[4681]: I1007 17:15:16.429120 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd" event={"ID":"d9aa3d7c-f712-4749-a1d8-a9688c1c3d23","Type":"ContainerDied","Data":"f32f915c969fabc27d2e0413dd11d6d252033af1742c668dd39a3b0a31d59730"} Oct 07 17:15:16 crc kubenswrapper[4681]: I1007 17:15:16.429367 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd" event={"ID":"d9aa3d7c-f712-4749-a1d8-a9688c1c3d23","Type":"ContainerStarted","Data":"7b4ba2314dd57998d35308409e16ff2380a28b583e3945b5e843a3ca4fd7835b"} Oct 07 17:15:17 crc kubenswrapper[4681]: I1007 17:15:17.627227 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd" Oct 07 17:15:17 crc kubenswrapper[4681]: I1007 17:15:17.780981 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9aa3d7c-f712-4749-a1d8-a9688c1c3d23-config-volume\") pod \"d9aa3d7c-f712-4749-a1d8-a9688c1c3d23\" (UID: \"d9aa3d7c-f712-4749-a1d8-a9688c1c3d23\") " Oct 07 17:15:17 crc kubenswrapper[4681]: I1007 17:15:17.781080 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fl7b\" (UniqueName: \"kubernetes.io/projected/d9aa3d7c-f712-4749-a1d8-a9688c1c3d23-kube-api-access-7fl7b\") pod \"d9aa3d7c-f712-4749-a1d8-a9688c1c3d23\" (UID: \"d9aa3d7c-f712-4749-a1d8-a9688c1c3d23\") " Oct 07 17:15:17 crc kubenswrapper[4681]: I1007 17:15:17.781620 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9aa3d7c-f712-4749-a1d8-a9688c1c3d23-config-volume" (OuterVolumeSpecName: "config-volume") pod "d9aa3d7c-f712-4749-a1d8-a9688c1c3d23" (UID: "d9aa3d7c-f712-4749-a1d8-a9688c1c3d23"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:15:17 crc kubenswrapper[4681]: I1007 17:15:17.781950 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9aa3d7c-f712-4749-a1d8-a9688c1c3d23-secret-volume\") pod \"d9aa3d7c-f712-4749-a1d8-a9688c1c3d23\" (UID: \"d9aa3d7c-f712-4749-a1d8-a9688c1c3d23\") " Oct 07 17:15:17 crc kubenswrapper[4681]: I1007 17:15:17.782129 4681 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9aa3d7c-f712-4749-a1d8-a9688c1c3d23-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 17:15:17 crc kubenswrapper[4681]: I1007 17:15:17.785645 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9aa3d7c-f712-4749-a1d8-a9688c1c3d23-kube-api-access-7fl7b" (OuterVolumeSpecName: "kube-api-access-7fl7b") pod "d9aa3d7c-f712-4749-a1d8-a9688c1c3d23" (UID: "d9aa3d7c-f712-4749-a1d8-a9688c1c3d23"). InnerVolumeSpecName "kube-api-access-7fl7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:15:17 crc kubenswrapper[4681]: I1007 17:15:17.785706 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9aa3d7c-f712-4749-a1d8-a9688c1c3d23-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d9aa3d7c-f712-4749-a1d8-a9688c1c3d23" (UID: "d9aa3d7c-f712-4749-a1d8-a9688c1c3d23"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:15:17 crc kubenswrapper[4681]: I1007 17:15:17.882640 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fl7b\" (UniqueName: \"kubernetes.io/projected/d9aa3d7c-f712-4749-a1d8-a9688c1c3d23-kube-api-access-7fl7b\") on node \"crc\" DevicePath \"\"" Oct 07 17:15:17 crc kubenswrapper[4681]: I1007 17:15:17.882679 4681 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9aa3d7c-f712-4749-a1d8-a9688c1c3d23-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 17:15:18 crc kubenswrapper[4681]: I1007 17:15:18.441579 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd" event={"ID":"d9aa3d7c-f712-4749-a1d8-a9688c1c3d23","Type":"ContainerDied","Data":"7b4ba2314dd57998d35308409e16ff2380a28b583e3945b5e843a3ca4fd7835b"} Oct 07 17:15:18 crc kubenswrapper[4681]: I1007 17:15:18.441616 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b4ba2314dd57998d35308409e16ff2380a28b583e3945b5e843a3ca4fd7835b" Oct 07 17:15:18 crc kubenswrapper[4681]: I1007 17:15:18.441636 4681 util.go:48] "No ready sandbox for pod can be found. 
Oct 07 17:15:33 crc kubenswrapper[4681]: I1007 17:15:33.291403 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn"] Oct 07 17:15:33 crc kubenswrapper[4681]: E1007 17:15:33.292363 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9aa3d7c-f712-4749-a1d8-a9688c1c3d23" containerName="collect-profiles" Oct 07 17:15:33 crc kubenswrapper[4681]: I1007 17:15:33.292377 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9aa3d7c-f712-4749-a1d8-a9688c1c3d23" containerName="collect-profiles" Oct 07 17:15:33 crc kubenswrapper[4681]: I1007 17:15:33.292486 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9aa3d7c-f712-4749-a1d8-a9688c1c3d23" containerName="collect-profiles" Oct 07 17:15:33 crc kubenswrapper[4681]: I1007 17:15:33.293407 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn" Oct 07 17:15:33 crc kubenswrapper[4681]: I1007 17:15:33.302358 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn"] Oct 07 17:15:33 crc kubenswrapper[4681]: I1007 17:15:33.303089 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 07 17:15:33 crc kubenswrapper[4681]: I1007 17:15:33.462913 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8s2g\" (UniqueName: \"kubernetes.io/projected/d7efbe4b-6c4e-4597-a08a-c65043f2466a-kube-api-access-g8s2g\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn\" (UID: \"d7efbe4b-6c4e-4597-a08a-c65043f2466a\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn" Oct 07 17:15:33 crc kubenswrapper[4681]: I1007 17:15:33.463073 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d7efbe4b-6c4e-4597-a08a-c65043f2466a-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn\" (UID: \"d7efbe4b-6c4e-4597-a08a-c65043f2466a\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn" Oct 07 17:15:33 crc kubenswrapper[4681]: I1007 17:15:33.463135 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d7efbe4b-6c4e-4597-a08a-c65043f2466a-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn\" (UID: \"d7efbe4b-6c4e-4597-a08a-c65043f2466a\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn" Oct 07 17:15:33 crc kubenswrapper[4681]: I1007 17:15:33.565053 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d7efbe4b-6c4e-4597-a08a-c65043f2466a-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn\" (UID: \"d7efbe4b-6c4e-4597-a08a-c65043f2466a\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn" Oct 07 17:15:33 crc kubenswrapper[4681]: I1007 17:15:33.565387 4681 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d7efbe4b-6c4e-4597-a08a-c65043f2466a-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn\" (UID: \"d7efbe4b-6c4e-4597-a08a-c65043f2466a\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn" Oct 07 17:15:33 crc kubenswrapper[4681]: I1007 17:15:33.565647 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8s2g\" (UniqueName: \"kubernetes.io/projected/d7efbe4b-6c4e-4597-a08a-c65043f2466a-kube-api-access-g8s2g\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn\" (UID: \"d7efbe4b-6c4e-4597-a08a-c65043f2466a\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn" Oct 07 17:15:33 crc kubenswrapper[4681]: I1007 17:15:33.566213 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d7efbe4b-6c4e-4597-a08a-c65043f2466a-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn\" (UID: \"d7efbe4b-6c4e-4597-a08a-c65043f2466a\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn" Oct 07 17:15:33 crc kubenswrapper[4681]: I1007 17:15:33.566267 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d7efbe4b-6c4e-4597-a08a-c65043f2466a-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn\" (UID: \"d7efbe4b-6c4e-4597-a08a-c65043f2466a\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn" Oct 07 17:15:33 crc kubenswrapper[4681]: I1007 17:15:33.589537 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8s2g\" (UniqueName: \"kubernetes.io/projected/d7efbe4b-6c4e-4597-a08a-c65043f2466a-kube-api-access-g8s2g\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn\" (UID: \"d7efbe4b-6c4e-4597-a08a-c65043f2466a\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn" Oct 07 17:15:33 crc kubenswrapper[4681]: I1007 17:15:33.610248 4681 util.go:30] "No sandbox for pod can be found. 
Oct 07 17:15:33 crc kubenswrapper[4681]: I1007 17:15:33.799183 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn"] Oct 07 17:15:34 crc kubenswrapper[4681]: I1007 17:15:34.520314 4681 generic.go:334] "Generic (PLEG): container finished" podID="d7efbe4b-6c4e-4597-a08a-c65043f2466a" containerID="b303c96f74fb7789e60bd43c293cd36e84ae6e0b0c26c940bac4c1d8303a908b" exitCode=0 Oct 07 17:15:34 crc kubenswrapper[4681]: I1007 17:15:34.520364 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn" event={"ID":"d7efbe4b-6c4e-4597-a08a-c65043f2466a","Type":"ContainerDied","Data":"b303c96f74fb7789e60bd43c293cd36e84ae6e0b0c26c940bac4c1d8303a908b"} Oct 07 17:15:34 crc kubenswrapper[4681]: I1007 17:15:34.520589 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn" event={"ID":"d7efbe4b-6c4e-4597-a08a-c65043f2466a","Type":"ContainerStarted","Data":"8eede04073cf629d61eb8c2c69755cce0061567e9d8fccdfedfad0bc1c1acd4e"} Oct 07 17:15:36 crc kubenswrapper[4681]: I1007 17:15:36.531738 4681 generic.go:334] "Generic (PLEG): container finished" podID="d7efbe4b-6c4e-4597-a08a-c65043f2466a" containerID="9bcfce7819a506e0b4ad3772ede1506e8abf76477ed31424a335d8bff9af91ab" exitCode=0 Oct 07 17:15:36 crc kubenswrapper[4681]: I1007 17:15:36.531774 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn" event={"ID":"d7efbe4b-6c4e-4597-a08a-c65043f2466a","Type":"ContainerDied","Data":"9bcfce7819a506e0b4ad3772ede1506e8abf76477ed31424a335d8bff9af91ab"} Oct 07 17:15:37 crc kubenswrapper[4681]: I1007 17:15:37.539164 4681 generic.go:334] "Generic (PLEG): container finished" podID="d7efbe4b-6c4e-4597-a08a-c65043f2466a" containerID="a6616303120b1a7f37e249d6ab147c7cce8cc8d9cd6eb5b05c9fe7370fcef372" exitCode=0 Oct 07 17:15:37 crc kubenswrapper[4681]: I1007 17:15:37.539234 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn" event={"ID":"d7efbe4b-6c4e-4597-a08a-c65043f2466a","Type":"ContainerDied","Data":"a6616303120b1a7f37e249d6ab147c7cce8cc8d9cd6eb5b05c9fe7370fcef372"} Oct 07 17:15:38 crc kubenswrapper[4681]: I1007 17:15:38.739223 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn" Oct 07 17:15:38 crc kubenswrapper[4681]: I1007 17:15:38.858167 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d7efbe4b-6c4e-4597-a08a-c65043f2466a-bundle\") pod \"d7efbe4b-6c4e-4597-a08a-c65043f2466a\" (UID: \"d7efbe4b-6c4e-4597-a08a-c65043f2466a\") " Oct 07 17:15:38 crc kubenswrapper[4681]: I1007 17:15:38.858271 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8s2g\" (UniqueName: \"kubernetes.io/projected/d7efbe4b-6c4e-4597-a08a-c65043f2466a-kube-api-access-g8s2g\") pod \"d7efbe4b-6c4e-4597-a08a-c65043f2466a\" (UID: \"d7efbe4b-6c4e-4597-a08a-c65043f2466a\") " Oct 07 17:15:38 crc kubenswrapper[4681]: I1007 17:15:38.858361 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d7efbe4b-6c4e-4597-a08a-c65043f2466a-util\") pod \"d7efbe4b-6c4e-4597-a08a-c65043f2466a\" (UID: \"d7efbe4b-6c4e-4597-a08a-c65043f2466a\") " Oct 07 17:15:38 crc kubenswrapper[4681]: I1007 17:15:38.859074 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7efbe4b-6c4e-4597-a08a-c65043f2466a-bundle" (OuterVolumeSpecName: "bundle") pod "d7efbe4b-6c4e-4597-a08a-c65043f2466a" (UID: "d7efbe4b-6c4e-4597-a08a-c65043f2466a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:15:38 crc kubenswrapper[4681]: I1007 17:15:38.862538 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7efbe4b-6c4e-4597-a08a-c65043f2466a-kube-api-access-g8s2g" (OuterVolumeSpecName: "kube-api-access-g8s2g") pod "d7efbe4b-6c4e-4597-a08a-c65043f2466a" (UID: "d7efbe4b-6c4e-4597-a08a-c65043f2466a"). InnerVolumeSpecName "kube-api-access-g8s2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:15:38 crc kubenswrapper[4681]: I1007 17:15:38.872280 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7efbe4b-6c4e-4597-a08a-c65043f2466a-util" (OuterVolumeSpecName: "util") pod "d7efbe4b-6c4e-4597-a08a-c65043f2466a" (UID: "d7efbe4b-6c4e-4597-a08a-c65043f2466a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:15:38 crc kubenswrapper[4681]: I1007 17:15:38.959857 4681 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d7efbe4b-6c4e-4597-a08a-c65043f2466a-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:15:38 crc kubenswrapper[4681]: I1007 17:15:38.959976 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8s2g\" (UniqueName: \"kubernetes.io/projected/d7efbe4b-6c4e-4597-a08a-c65043f2466a-kube-api-access-g8s2g\") on node \"crc\" DevicePath \"\"" Oct 07 17:15:38 crc kubenswrapper[4681]: I1007 17:15:38.959997 4681 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d7efbe4b-6c4e-4597-a08a-c65043f2466a-util\") on node \"crc\" DevicePath \"\"" Oct 07 17:15:39 crc kubenswrapper[4681]: I1007 17:15:39.549094 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn" event={"ID":"d7efbe4b-6c4e-4597-a08a-c65043f2466a","Type":"ContainerDied","Data":"8eede04073cf629d61eb8c2c69755cce0061567e9d8fccdfedfad0bc1c1acd4e"} Oct 07 17:15:39 crc kubenswrapper[4681]: I1007 17:15:39.549334 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8eede04073cf629d61eb8c2c69755cce0061567e9d8fccdfedfad0bc1c1acd4e" Oct 07 17:15:39 crc kubenswrapper[4681]: I1007 17:15:39.549156 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn" Oct 07 17:15:41 crc kubenswrapper[4681]: I1007 17:15:41.332913 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fst86"] Oct 07 17:15:41 crc kubenswrapper[4681]: I1007 17:15:41.334261 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-fst86" podUID="0d8e31b0-652c-44ff-99b0-04ae7d329f6f" containerName="controller-manager" containerID="cri-o://9546f84c29a5e91e6d3fa5541f3809ad3574bf869f88501ff464ba6b851f882d" gracePeriod=30 Oct 07 17:15:41 crc kubenswrapper[4681]: I1007 17:15:41.436670 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm"] Oct 07 17:15:41 crc kubenswrapper[4681]: I1007 17:15:41.437305 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm" podUID="b9b8f6ae-aa50-4e7a-a59e-359b18fada73" containerName="route-controller-manager" containerID="cri-o://227ec6845a07508f16d53b03b5cb376f23d51d282dbea4bb067fd6e339778a51" gracePeriod=30 Oct 07 17:15:41 crc kubenswrapper[4681]: I1007 17:15:41.564766 4681 generic.go:334] "Generic (PLEG): container finished" podID="0d8e31b0-652c-44ff-99b0-04ae7d329f6f" containerID="9546f84c29a5e91e6d3fa5541f3809ad3574bf869f88501ff464ba6b851f882d" exitCode=0 Oct 07 17:15:41 crc kubenswrapper[4681]: I1007 17:15:41.564815 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fst86" event={"ID":"0d8e31b0-652c-44ff-99b0-04ae7d329f6f","Type":"ContainerDied","Data":"9546f84c29a5e91e6d3fa5541f3809ad3574bf869f88501ff464ba6b851f882d"} Oct 07 17:15:41 crc kubenswrapper[4681]: I1007 17:15:41.951802 4681 util.go:48] "No ready sandbox for pod can be 
Oct 07 17:15:41 crc kubenswrapper[4681]: I1007 17:15:41.989612 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-proxy-ca-bundles\") pod \"0d8e31b0-652c-44ff-99b0-04ae7d329f6f\" (UID: \"0d8e31b0-652c-44ff-99b0-04ae7d329f6f\") "
Oct 07 17:15:41 crc kubenswrapper[4681]: I1007 17:15:41.989679 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffpz2\" (UniqueName: \"kubernetes.io/projected/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-kube-api-access-ffpz2\") pod \"0d8e31b0-652c-44ff-99b0-04ae7d329f6f\" (UID: \"0d8e31b0-652c-44ff-99b0-04ae7d329f6f\") "
Oct 07 17:15:41 crc kubenswrapper[4681]: I1007 17:15:41.989736 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-serving-cert\") pod \"0d8e31b0-652c-44ff-99b0-04ae7d329f6f\" (UID: \"0d8e31b0-652c-44ff-99b0-04ae7d329f6f\") "
Oct 07 17:15:41 crc kubenswrapper[4681]: I1007 17:15:41.989770 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-client-ca\") pod \"0d8e31b0-652c-44ff-99b0-04ae7d329f6f\" (UID: \"0d8e31b0-652c-44ff-99b0-04ae7d329f6f\") "
Oct 07 17:15:41 crc kubenswrapper[4681]: I1007 17:15:41.989842 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-config\") pod \"0d8e31b0-652c-44ff-99b0-04ae7d329f6f\" (UID: \"0d8e31b0-652c-44ff-99b0-04ae7d329f6f\") "
Oct 07 17:15:41 crc kubenswrapper[4681]: I1007 17:15:41.990826 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-config" (OuterVolumeSpecName: "config") pod "0d8e31b0-652c-44ff-99b0-04ae7d329f6f" (UID: "0d8e31b0-652c-44ff-99b0-04ae7d329f6f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 17:15:41 crc kubenswrapper[4681]: I1007 17:15:41.991579 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0d8e31b0-652c-44ff-99b0-04ae7d329f6f" (UID: "0d8e31b0-652c-44ff-99b0-04ae7d329f6f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 17:15:41 crc kubenswrapper[4681]: I1007 17:15:41.991674 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-client-ca" (OuterVolumeSpecName: "client-ca") pod "0d8e31b0-652c-44ff-99b0-04ae7d329f6f" (UID: "0d8e31b0-652c-44ff-99b0-04ae7d329f6f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 17:15:41 crc kubenswrapper[4681]: I1007 17:15:41.998214 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-kube-api-access-ffpz2" (OuterVolumeSpecName: "kube-api-access-ffpz2") pod "0d8e31b0-652c-44ff-99b0-04ae7d329f6f" (UID: "0d8e31b0-652c-44ff-99b0-04ae7d329f6f"). InnerVolumeSpecName "kube-api-access-ffpz2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.000191 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0d8e31b0-652c-44ff-99b0-04ae7d329f6f" (UID: "0d8e31b0-652c-44ff-99b0-04ae7d329f6f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.055299 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.091211 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9b8f6ae-aa50-4e7a-a59e-359b18fada73-client-ca\") pod \"b9b8f6ae-aa50-4e7a-a59e-359b18fada73\" (UID: \"b9b8f6ae-aa50-4e7a-a59e-359b18fada73\") "
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.091288 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9sj6\" (UniqueName: \"kubernetes.io/projected/b9b8f6ae-aa50-4e7a-a59e-359b18fada73-kube-api-access-b9sj6\") pod \"b9b8f6ae-aa50-4e7a-a59e-359b18fada73\" (UID: \"b9b8f6ae-aa50-4e7a-a59e-359b18fada73\") "
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.091334 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9b8f6ae-aa50-4e7a-a59e-359b18fada73-serving-cert\") pod \"b9b8f6ae-aa50-4e7a-a59e-359b18fada73\" (UID: \"b9b8f6ae-aa50-4e7a-a59e-359b18fada73\") "
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.091447 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9b8f6ae-aa50-4e7a-a59e-359b18fada73-config\") pod \"b9b8f6ae-aa50-4e7a-a59e-359b18fada73\" (UID: \"b9b8f6ae-aa50-4e7a-a59e-359b18fada73\") "
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.091672 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.091690 4681 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-client-ca\") on node \"crc\" DevicePath \"\""
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.091703 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-config\") on node \"crc\" DevicePath \"\""
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.091716 4681 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.091728 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffpz2\" (UniqueName: \"kubernetes.io/projected/0d8e31b0-652c-44ff-99b0-04ae7d329f6f-kube-api-access-ffpz2\") on node \"crc\" DevicePath \"\""
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.092647 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9b8f6ae-aa50-4e7a-a59e-359b18fada73-config" (OuterVolumeSpecName: "config") pod "b9b8f6ae-aa50-4e7a-a59e-359b18fada73" (UID: "b9b8f6ae-aa50-4e7a-a59e-359b18fada73"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.092978 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9b8f6ae-aa50-4e7a-a59e-359b18fada73-client-ca" (OuterVolumeSpecName: "client-ca") pod "b9b8f6ae-aa50-4e7a-a59e-359b18fada73" (UID: "b9b8f6ae-aa50-4e7a-a59e-359b18fada73"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.096566 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9b8f6ae-aa50-4e7a-a59e-359b18fada73-kube-api-access-b9sj6" (OuterVolumeSpecName: "kube-api-access-b9sj6") pod "b9b8f6ae-aa50-4e7a-a59e-359b18fada73" (UID: "b9b8f6ae-aa50-4e7a-a59e-359b18fada73"). InnerVolumeSpecName "kube-api-access-b9sj6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.097498 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9b8f6ae-aa50-4e7a-a59e-359b18fada73-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b9b8f6ae-aa50-4e7a-a59e-359b18fada73" (UID: "b9b8f6ae-aa50-4e7a-a59e-359b18fada73"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.192589 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9b8f6ae-aa50-4e7a-a59e-359b18fada73-config\") on node \"crc\" DevicePath \"\""
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.192624 4681 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9b8f6ae-aa50-4e7a-a59e-359b18fada73-client-ca\") on node \"crc\" DevicePath \"\""
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.192635 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9sj6\" (UniqueName: \"kubernetes.io/projected/b9b8f6ae-aa50-4e7a-a59e-359b18fada73-kube-api-access-b9sj6\") on node \"crc\" DevicePath \"\""
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.192647 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9b8f6ae-aa50-4e7a-a59e-359b18fada73-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.198073 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.198139 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.198190 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.198631 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c9e051db851240f0bede3ae5fc25fdd2610a6d8f3198a352363e8b66b292625b"} pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.198694 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" containerID="cri-o://c9e051db851240f0bede3ae5fc25fdd2610a6d8f3198a352363e8b66b292625b" gracePeriod=600
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.400912 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-2xfw7"]
Oct 07 17:15:42 crc kubenswrapper[4681]: E1007 17:15:42.403023 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9b8f6ae-aa50-4e7a-a59e-359b18fada73" containerName="route-controller-manager"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.403138 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9b8f6ae-aa50-4e7a-a59e-359b18fada73" containerName="route-controller-manager"
Oct 07 17:15:42 crc kubenswrapper[4681]: E1007 17:15:42.403230 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7efbe4b-6c4e-4597-a08a-c65043f2466a" containerName="extract"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.403320 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7efbe4b-6c4e-4597-a08a-c65043f2466a" containerName="extract"
Oct 07 17:15:42 crc kubenswrapper[4681]: E1007 17:15:42.403434 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7efbe4b-6c4e-4597-a08a-c65043f2466a" containerName="util"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.403509 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7efbe4b-6c4e-4597-a08a-c65043f2466a" containerName="util"
Oct 07 17:15:42 crc kubenswrapper[4681]: E1007 17:15:42.403592 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d8e31b0-652c-44ff-99b0-04ae7d329f6f" containerName="controller-manager"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.403671 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d8e31b0-652c-44ff-99b0-04ae7d329f6f" containerName="controller-manager"
Oct 07 17:15:42 crc kubenswrapper[4681]: E1007 17:15:42.403750 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7efbe4b-6c4e-4597-a08a-c65043f2466a" containerName="pull"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.403824 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7efbe4b-6c4e-4597-a08a-c65043f2466a" containerName="pull"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.407139 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7efbe4b-6c4e-4597-a08a-c65043f2466a" containerName="extract"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.407310 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9b8f6ae-aa50-4e7a-a59e-359b18fada73" containerName="route-controller-manager"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.407396 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d8e31b0-652c-44ff-99b0-04ae7d329f6f" containerName="controller-manager"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.408179 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-2xfw7"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.412354 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.412422 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.434972 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-2xfw7"]
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.496136 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bkkm\" (UniqueName: \"kubernetes.io/projected/d77c5294-44ad-4618-abf8-143fb7872315-kube-api-access-5bkkm\") pod \"nmstate-operator-858ddd8f98-2xfw7\" (UID: \"d77c5294-44ad-4618-abf8-143fb7872315\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-2xfw7"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.572043 4681 generic.go:334] "Generic (PLEG): container finished" podID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerID="c9e051db851240f0bede3ae5fc25fdd2610a6d8f3198a352363e8b66b292625b" exitCode=0
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.572116 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerDied","Data":"c9e051db851240f0bede3ae5fc25fdd2610a6d8f3198a352363e8b66b292625b"}
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.572231 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerStarted","Data":"f140a217647ffa9460543862999c01a07af9aa4d5b74d190946e7b3d091b13cf"}
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.572263 4681 scope.go:117] "RemoveContainer" containerID="caa518685bdba2e7b2d342f42ad137f5c20588f53105f29b5bb3989eba11aede"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.573561 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fst86" event={"ID":"0d8e31b0-652c-44ff-99b0-04ae7d329f6f","Type":"ContainerDied","Data":"8f55dc7cfbe5e9300f49fdba8b028e82f4d16158c752b042caae4733151fa886"}
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.573619 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fst86"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.574943 4681 generic.go:334] "Generic (PLEG): container finished" podID="b9b8f6ae-aa50-4e7a-a59e-359b18fada73" containerID="227ec6845a07508f16d53b03b5cb376f23d51d282dbea4bb067fd6e339778a51" exitCode=0
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.574972 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm" event={"ID":"b9b8f6ae-aa50-4e7a-a59e-359b18fada73","Type":"ContainerDied","Data":"227ec6845a07508f16d53b03b5cb376f23d51d282dbea4bb067fd6e339778a51"}
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.574989 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm" event={"ID":"b9b8f6ae-aa50-4e7a-a59e-359b18fada73","Type":"ContainerDied","Data":"7463f0e6e7d4975d0033d4e3f5105e1d6f2dd97eaeaa6484480ecf6acfe96dc9"}
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.575045 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.590667 4681 scope.go:117] "RemoveContainer" containerID="9546f84c29a5e91e6d3fa5541f3809ad3574bf869f88501ff464ba6b851f882d"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.597073 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bkkm\" (UniqueName: \"kubernetes.io/projected/d77c5294-44ad-4618-abf8-143fb7872315-kube-api-access-5bkkm\") pod \"nmstate-operator-858ddd8f98-2xfw7\" (UID: \"d77c5294-44ad-4618-abf8-143fb7872315\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-2xfw7"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.605557 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fst86"]
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.608288 4681 scope.go:117] "RemoveContainer" containerID="227ec6845a07508f16d53b03b5cb376f23d51d282dbea4bb067fd6e339778a51"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.610094 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fst86"]
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.616107 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm"]
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.619410 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-26ntm"]
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.634731 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bkkm\" (UniqueName: \"kubernetes.io/projected/d77c5294-44ad-4618-abf8-143fb7872315-kube-api-access-5bkkm\") pod \"nmstate-operator-858ddd8f98-2xfw7\" (UID: \"d77c5294-44ad-4618-abf8-143fb7872315\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-2xfw7"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.644409 4681 scope.go:117] "RemoveContainer" containerID="227ec6845a07508f16d53b03b5cb376f23d51d282dbea4bb067fd6e339778a51"
Oct 07 17:15:42 crc kubenswrapper[4681]: E1007 17:15:42.644848 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"227ec6845a07508f16d53b03b5cb376f23d51d282dbea4bb067fd6e339778a51\": container with ID starting with 227ec6845a07508f16d53b03b5cb376f23d51d282dbea4bb067fd6e339778a51 not found: ID does not exist" containerID="227ec6845a07508f16d53b03b5cb376f23d51d282dbea4bb067fd6e339778a51"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.644897 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"227ec6845a07508f16d53b03b5cb376f23d51d282dbea4bb067fd6e339778a51"} err="failed to get container status \"227ec6845a07508f16d53b03b5cb376f23d51d282dbea4bb067fd6e339778a51\": rpc error: code = NotFound desc = could not find container \"227ec6845a07508f16d53b03b5cb376f23d51d282dbea4bb067fd6e339778a51\": container with ID starting with 227ec6845a07508f16d53b03b5cb376f23d51d282dbea4bb067fd6e339778a51 not found: ID does not exist"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.736238 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-2xfw7"
Oct 07 17:15:42 crc kubenswrapper[4681]: I1007 17:15:42.982268 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-2xfw7"]
Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.005149 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7f7bd9bd6-5gpwh"]
Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.005961 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f7bd9bd6-5gpwh"
Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.010554 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.010722 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.011160 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.011297 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.011408 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.011952 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.015573 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb7b97c4b-lnn67"]
Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.017935 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.019866 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f7bd9bd6-5gpwh"]
Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.019999 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cb7b97c4b-lnn67"
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cb7b97c4b-lnn67" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.025676 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.025958 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.026309 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.026799 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.027583 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.027800 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.037312 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d8e31b0-652c-44ff-99b0-04ae7d329f6f" path="/var/lib/kubelet/pods/0d8e31b0-652c-44ff-99b0-04ae7d329f6f/volumes" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.039054 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9b8f6ae-aa50-4e7a-a59e-359b18fada73" path="/var/lib/kubelet/pods/b9b8f6ae-aa50-4e7a-a59e-359b18fada73/volumes" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.039830 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb7b97c4b-lnn67"] Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.103421 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d202bc3c-4836-4297-aeb4-5de8908b709e-config\") pod \"controller-manager-7f7bd9bd6-5gpwh\" (UID: \"d202bc3c-4836-4297-aeb4-5de8908b709e\") " pod="openshift-controller-manager/controller-manager-7f7bd9bd6-5gpwh" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.103481 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d16c874e-a345-4f62-9a88-fc7206394f7a-serving-cert\") pod \"route-controller-manager-5cb7b97c4b-lnn67\" (UID: \"d16c874e-a345-4f62-9a88-fc7206394f7a\") " pod="openshift-route-controller-manager/route-controller-manager-5cb7b97c4b-lnn67" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.103512 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqj2k\" (UniqueName: \"kubernetes.io/projected/d16c874e-a345-4f62-9a88-fc7206394f7a-kube-api-access-sqj2k\") pod \"route-controller-manager-5cb7b97c4b-lnn67\" (UID: \"d16c874e-a345-4f62-9a88-fc7206394f7a\") " pod="openshift-route-controller-manager/route-controller-manager-5cb7b97c4b-lnn67" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.103563 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/d202bc3c-4836-4297-aeb4-5de8908b709e-proxy-ca-bundles\") pod \"controller-manager-7f7bd9bd6-5gpwh\" (UID: \"d202bc3c-4836-4297-aeb4-5de8908b709e\") " pod="openshift-controller-manager/controller-manager-7f7bd9bd6-5gpwh" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.103580 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d202bc3c-4836-4297-aeb4-5de8908b709e-client-ca\") pod \"controller-manager-7f7bd9bd6-5gpwh\" (UID: \"d202bc3c-4836-4297-aeb4-5de8908b709e\") " pod="openshift-controller-manager/controller-manager-7f7bd9bd6-5gpwh" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.103606 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d202bc3c-4836-4297-aeb4-5de8908b709e-serving-cert\") pod \"controller-manager-7f7bd9bd6-5gpwh\" (UID: \"d202bc3c-4836-4297-aeb4-5de8908b709e\") " pod="openshift-controller-manager/controller-manager-7f7bd9bd6-5gpwh" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.103641 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc5r7\" (UniqueName: \"kubernetes.io/projected/d202bc3c-4836-4297-aeb4-5de8908b709e-kube-api-access-nc5r7\") pod \"controller-manager-7f7bd9bd6-5gpwh\" (UID: \"d202bc3c-4836-4297-aeb4-5de8908b709e\") " pod="openshift-controller-manager/controller-manager-7f7bd9bd6-5gpwh" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.103677 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d16c874e-a345-4f62-9a88-fc7206394f7a-client-ca\") pod \"route-controller-manager-5cb7b97c4b-lnn67\" (UID: \"d16c874e-a345-4f62-9a88-fc7206394f7a\") " pod="openshift-route-controller-manager/route-controller-manager-5cb7b97c4b-lnn67" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.103700 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d16c874e-a345-4f62-9a88-fc7206394f7a-config\") pod \"route-controller-manager-5cb7b97c4b-lnn67\" (UID: \"d16c874e-a345-4f62-9a88-fc7206394f7a\") " pod="openshift-route-controller-manager/route-controller-manager-5cb7b97c4b-lnn67" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.205219 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d16c874e-a345-4f62-9a88-fc7206394f7a-client-ca\") pod \"route-controller-manager-5cb7b97c4b-lnn67\" (UID: \"d16c874e-a345-4f62-9a88-fc7206394f7a\") " pod="openshift-route-controller-manager/route-controller-manager-5cb7b97c4b-lnn67" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.205264 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d16c874e-a345-4f62-9a88-fc7206394f7a-config\") pod \"route-controller-manager-5cb7b97c4b-lnn67\" (UID: \"d16c874e-a345-4f62-9a88-fc7206394f7a\") " pod="openshift-route-controller-manager/route-controller-manager-5cb7b97c4b-lnn67" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.205286 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d202bc3c-4836-4297-aeb4-5de8908b709e-config\") pod 
\"controller-manager-7f7bd9bd6-5gpwh\" (UID: \"d202bc3c-4836-4297-aeb4-5de8908b709e\") " pod="openshift-controller-manager/controller-manager-7f7bd9bd6-5gpwh" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.205310 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d16c874e-a345-4f62-9a88-fc7206394f7a-serving-cert\") pod \"route-controller-manager-5cb7b97c4b-lnn67\" (UID: \"d16c874e-a345-4f62-9a88-fc7206394f7a\") " pod="openshift-route-controller-manager/route-controller-manager-5cb7b97c4b-lnn67" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.205339 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqj2k\" (UniqueName: \"kubernetes.io/projected/d16c874e-a345-4f62-9a88-fc7206394f7a-kube-api-access-sqj2k\") pod \"route-controller-manager-5cb7b97c4b-lnn67\" (UID: \"d16c874e-a345-4f62-9a88-fc7206394f7a\") " pod="openshift-route-controller-manager/route-controller-manager-5cb7b97c4b-lnn67" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.205384 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d202bc3c-4836-4297-aeb4-5de8908b709e-proxy-ca-bundles\") pod \"controller-manager-7f7bd9bd6-5gpwh\" (UID: \"d202bc3c-4836-4297-aeb4-5de8908b709e\") " pod="openshift-controller-manager/controller-manager-7f7bd9bd6-5gpwh" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.205406 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d202bc3c-4836-4297-aeb4-5de8908b709e-client-ca\") pod \"controller-manager-7f7bd9bd6-5gpwh\" (UID: \"d202bc3c-4836-4297-aeb4-5de8908b709e\") " pod="openshift-controller-manager/controller-manager-7f7bd9bd6-5gpwh" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.205435 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d202bc3c-4836-4297-aeb4-5de8908b709e-serving-cert\") pod \"controller-manager-7f7bd9bd6-5gpwh\" (UID: \"d202bc3c-4836-4297-aeb4-5de8908b709e\") " pod="openshift-controller-manager/controller-manager-7f7bd9bd6-5gpwh" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.205470 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc5r7\" (UniqueName: \"kubernetes.io/projected/d202bc3c-4836-4297-aeb4-5de8908b709e-kube-api-access-nc5r7\") pod \"controller-manager-7f7bd9bd6-5gpwh\" (UID: \"d202bc3c-4836-4297-aeb4-5de8908b709e\") " pod="openshift-controller-manager/controller-manager-7f7bd9bd6-5gpwh" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.206121 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d16c874e-a345-4f62-9a88-fc7206394f7a-client-ca\") pod \"route-controller-manager-5cb7b97c4b-lnn67\" (UID: \"d16c874e-a345-4f62-9a88-fc7206394f7a\") " pod="openshift-route-controller-manager/route-controller-manager-5cb7b97c4b-lnn67" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.206143 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d202bc3c-4836-4297-aeb4-5de8908b709e-client-ca\") pod \"controller-manager-7f7bd9bd6-5gpwh\" (UID: \"d202bc3c-4836-4297-aeb4-5de8908b709e\") " pod="openshift-controller-manager/controller-manager-7f7bd9bd6-5gpwh" Oct 
07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.206942 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d16c874e-a345-4f62-9a88-fc7206394f7a-config\") pod \"route-controller-manager-5cb7b97c4b-lnn67\" (UID: \"d16c874e-a345-4f62-9a88-fc7206394f7a\") " pod="openshift-route-controller-manager/route-controller-manager-5cb7b97c4b-lnn67" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.207338 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d202bc3c-4836-4297-aeb4-5de8908b709e-proxy-ca-bundles\") pod \"controller-manager-7f7bd9bd6-5gpwh\" (UID: \"d202bc3c-4836-4297-aeb4-5de8908b709e\") " pod="openshift-controller-manager/controller-manager-7f7bd9bd6-5gpwh" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.207491 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d202bc3c-4836-4297-aeb4-5de8908b709e-config\") pod \"controller-manager-7f7bd9bd6-5gpwh\" (UID: \"d202bc3c-4836-4297-aeb4-5de8908b709e\") " pod="openshift-controller-manager/controller-manager-7f7bd9bd6-5gpwh" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.211963 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d16c874e-a345-4f62-9a88-fc7206394f7a-serving-cert\") pod \"route-controller-manager-5cb7b97c4b-lnn67\" (UID: \"d16c874e-a345-4f62-9a88-fc7206394f7a\") " pod="openshift-route-controller-manager/route-controller-manager-5cb7b97c4b-lnn67" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.213334 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d202bc3c-4836-4297-aeb4-5de8908b709e-serving-cert\") pod \"controller-manager-7f7bd9bd6-5gpwh\" (UID: \"d202bc3c-4836-4297-aeb4-5de8908b709e\") " pod="openshift-controller-manager/controller-manager-7f7bd9bd6-5gpwh" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.222556 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqj2k\" (UniqueName: \"kubernetes.io/projected/d16c874e-a345-4f62-9a88-fc7206394f7a-kube-api-access-sqj2k\") pod \"route-controller-manager-5cb7b97c4b-lnn67\" (UID: \"d16c874e-a345-4f62-9a88-fc7206394f7a\") " pod="openshift-route-controller-manager/route-controller-manager-5cb7b97c4b-lnn67" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.223845 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc5r7\" (UniqueName: \"kubernetes.io/projected/d202bc3c-4836-4297-aeb4-5de8908b709e-kube-api-access-nc5r7\") pod \"controller-manager-7f7bd9bd6-5gpwh\" (UID: \"d202bc3c-4836-4297-aeb4-5de8908b709e\") " pod="openshift-controller-manager/controller-manager-7f7bd9bd6-5gpwh" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.329736 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f7bd9bd6-5gpwh" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.342605 4681 util.go:30] "No sandbox for pod can be found. 
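
The reconciler_common entries above walk every volume of the two replacement pods through the same pipeline: VerifyControllerAttachedVolume, then "MountVolume started", then "MountVolume.SetUp succeeded". A sketch of the desired-state-versus-actual-state loop that pattern implies; the set types and mount callback are illustrative, not the kubelet's real ones:

    package main

    import "fmt"

    // reconcile mounts anything present in the desired set but missing
    // from the actual set; failures are left for the next pass.
    func reconcile(desired, actual map[string]bool, mount func(string) error) {
        for vol := range desired {
            if !actual[vol] {
                fmt.Printf("operationExecutor.MountVolume started for volume %q\n", vol)
                if err := mount(vol); err != nil {
                    fmt.Printf("MountVolume.SetUp failed for %q: %v\n", vol, err)
                    continue // retried on the reconciler's next pass
                }
                actual[vol] = true
                fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", vol)
            }
        }
    }

    func main() {
        desired := map[string]bool{"config": true, "client-ca": true, "serving-cert": true}
        actual := map[string]bool{}
        reconcile(desired, actual, func(string) error { return nil })
    }
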
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cb7b97c4b-lnn67" Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.585203 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-2xfw7" event={"ID":"d77c5294-44ad-4618-abf8-143fb7872315","Type":"ContainerStarted","Data":"1b88f07934592257978d97694a3fb90bd8a93b620e1dad77ceddec2db2e3d41f"} Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.594234 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f7bd9bd6-5gpwh"] Oct 07 17:15:43 crc kubenswrapper[4681]: I1007 17:15:43.714363 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb7b97c4b-lnn67"] Oct 07 17:15:43 crc kubenswrapper[4681]: W1007 17:15:43.724771 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd16c874e_a345_4f62_9a88_fc7206394f7a.slice/crio-762501e757510977390905ed6a5a7de8d82043082492deb7cd973ee7a4b5005d WatchSource:0}: Error finding container 762501e757510977390905ed6a5a7de8d82043082492deb7cd973ee7a4b5005d: Status 404 returned error can't find the container with id 762501e757510977390905ed6a5a7de8d82043082492deb7cd973ee7a4b5005d Oct 07 17:15:44 crc kubenswrapper[4681]: I1007 17:15:44.597685 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cb7b97c4b-lnn67" event={"ID":"d16c874e-a345-4f62-9a88-fc7206394f7a","Type":"ContainerStarted","Data":"2f893b72216dda3c46fcdeff2065f3a11f87e106669b1446625d4a72ce999dab"} Oct 07 17:15:44 crc kubenswrapper[4681]: I1007 17:15:44.598995 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cb7b97c4b-lnn67" event={"ID":"d16c874e-a345-4f62-9a88-fc7206394f7a","Type":"ContainerStarted","Data":"762501e757510977390905ed6a5a7de8d82043082492deb7cd973ee7a4b5005d"} Oct 07 17:15:44 crc kubenswrapper[4681]: I1007 17:15:44.599122 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5cb7b97c4b-lnn67" Oct 07 17:15:44 crc kubenswrapper[4681]: I1007 17:15:44.599990 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f7bd9bd6-5gpwh" event={"ID":"d202bc3c-4836-4297-aeb4-5de8908b709e","Type":"ContainerStarted","Data":"a4fcb4d7576889e9e81968c21684bcc521d07e35009ab23831360ab3741b8c0a"} Oct 07 17:15:44 crc kubenswrapper[4681]: I1007 17:15:44.600021 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f7bd9bd6-5gpwh" event={"ID":"d202bc3c-4836-4297-aeb4-5de8908b709e","Type":"ContainerStarted","Data":"784cb61e3ae065231dc945a84bf5b57795805d315d283d4b859e66b486daf0f1"} Oct 07 17:15:44 crc kubenswrapper[4681]: I1007 17:15:44.600303 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7f7bd9bd6-5gpwh" Oct 07 17:15:44 crc kubenswrapper[4681]: I1007 17:15:44.603716 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5cb7b97c4b-lnn67" Oct 07 17:15:44 crc kubenswrapper[4681]: I1007 17:15:44.605172 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-7f7bd9bd6-5gpwh" Oct 07 17:15:44 crc kubenswrapper[4681]: I1007 17:15:44.615557 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5cb7b97c4b-lnn67" podStartSLOduration=3.615541901 podStartE2EDuration="3.615541901s" podCreationTimestamp="2025-10-07 17:15:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:15:44.615201561 +0000 UTC m=+748.262613116" watchObservedRunningTime="2025-10-07 17:15:44.615541901 +0000 UTC m=+748.262953456" Oct 07 17:15:44 crc kubenswrapper[4681]: I1007 17:15:44.660957 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7f7bd9bd6-5gpwh" podStartSLOduration=3.6609370930000003 podStartE2EDuration="3.660937093s" podCreationTimestamp="2025-10-07 17:15:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:15:44.660806309 +0000 UTC m=+748.308217874" watchObservedRunningTime="2025-10-07 17:15:44.660937093 +0000 UTC m=+748.308348648" Oct 07 17:15:46 crc kubenswrapper[4681]: I1007 17:15:46.610944 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-2xfw7" event={"ID":"d77c5294-44ad-4618-abf8-143fb7872315","Type":"ContainerStarted","Data":"2f8a49753fa8d399254ba690c8ea573af4793a297eaa1bb164f3935e4417ce8d"} Oct 07 17:15:46 crc kubenswrapper[4681]: I1007 17:15:46.633016 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-2xfw7" podStartSLOduration=1.9410421690000002 podStartE2EDuration="4.633000491s" podCreationTimestamp="2025-10-07 17:15:42 +0000 UTC" firstStartedPulling="2025-10-07 17:15:43.002587327 +0000 UTC m=+746.649998882" lastFinishedPulling="2025-10-07 17:15:45.694545649 +0000 UTC m=+749.341957204" observedRunningTime="2025-10-07 17:15:46.630735819 +0000 UTC m=+750.278147374" watchObservedRunningTime="2025-10-07 17:15:46.633000491 +0000 UTC m=+750.280412046" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.751977 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-vkkc9"] Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.753019 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vkkc9" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.754568 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-m6st6" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.763734 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flgc9\" (UniqueName: \"kubernetes.io/projected/abc9d0ca-7b47-4f55-93ff-2f6cfa725fe7-kube-api-access-flgc9\") pod \"nmstate-metrics-fdff9cb8d-vkkc9\" (UID: \"abc9d0ca-7b47-4f55-93ff-2f6cfa725fe7\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vkkc9" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.768492 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-7zfgs"] Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.769150 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7zfgs" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.774061 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.805456 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-lc2dx"] Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.806137 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-lc2dx" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.848252 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-vkkc9"] Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.864557 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e37df9f5-e512-4e43-9c96-c193553b43dd-nmstate-lock\") pod \"nmstate-handler-lc2dx\" (UID: \"e37df9f5-e512-4e43-9c96-c193553b43dd\") " pod="openshift-nmstate/nmstate-handler-lc2dx" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.864597 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e37df9f5-e512-4e43-9c96-c193553b43dd-dbus-socket\") pod \"nmstate-handler-lc2dx\" (UID: \"e37df9f5-e512-4e43-9c96-c193553b43dd\") " pod="openshift-nmstate/nmstate-handler-lc2dx" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.864645 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2lc4\" (UniqueName: \"kubernetes.io/projected/e37df9f5-e512-4e43-9c96-c193553b43dd-kube-api-access-q2lc4\") pod \"nmstate-handler-lc2dx\" (UID: \"e37df9f5-e512-4e43-9c96-c193553b43dd\") " pod="openshift-nmstate/nmstate-handler-lc2dx" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.864670 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flgc9\" (UniqueName: \"kubernetes.io/projected/abc9d0ca-7b47-4f55-93ff-2f6cfa725fe7-kube-api-access-flgc9\") pod \"nmstate-metrics-fdff9cb8d-vkkc9\" (UID: \"abc9d0ca-7b47-4f55-93ff-2f6cfa725fe7\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vkkc9" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.864703 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e37df9f5-e512-4e43-9c96-c193553b43dd-ovs-socket\") pod \"nmstate-handler-lc2dx\" (UID: \"e37df9f5-e512-4e43-9c96-c193553b43dd\") " pod="openshift-nmstate/nmstate-handler-lc2dx" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.864743 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/64eeb4ec-129d-4fc8-be68-138e9c28cd3c-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-7zfgs\" (UID: \"64eeb4ec-129d-4fc8-be68-138e9c28cd3c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7zfgs" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.864774 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk6pg\" (UniqueName: \"kubernetes.io/projected/64eeb4ec-129d-4fc8-be68-138e9c28cd3c-kube-api-access-jk6pg\") pod \"nmstate-webhook-6cdbc54649-7zfgs\" 
(UID: \"64eeb4ec-129d-4fc8-be68-138e9c28cd3c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7zfgs" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.897717 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flgc9\" (UniqueName: \"kubernetes.io/projected/abc9d0ca-7b47-4f55-93ff-2f6cfa725fe7-kube-api-access-flgc9\") pod \"nmstate-metrics-fdff9cb8d-vkkc9\" (UID: \"abc9d0ca-7b47-4f55-93ff-2f6cfa725fe7\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vkkc9" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.907158 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-7zfgs"] Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.955343 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-klc4h"] Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.956380 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klc4h" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.965450 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.965592 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-tj7pv" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.965735 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e37df9f5-e512-4e43-9c96-c193553b43dd-nmstate-lock\") pod \"nmstate-handler-lc2dx\" (UID: \"e37df9f5-e512-4e43-9c96-c193553b43dd\") " pod="openshift-nmstate/nmstate-handler-lc2dx" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.965696 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e37df9f5-e512-4e43-9c96-c193553b43dd-nmstate-lock\") pod \"nmstate-handler-lc2dx\" (UID: \"e37df9f5-e512-4e43-9c96-c193553b43dd\") " pod="openshift-nmstate/nmstate-handler-lc2dx" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.970921 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8e7f096-d849-4c9d-8338-2117a554f2de-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-klc4h\" (UID: \"d8e7f096-d849-4c9d-8338-2117a554f2de\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klc4h" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.971015 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e37df9f5-e512-4e43-9c96-c193553b43dd-dbus-socket\") pod \"nmstate-handler-lc2dx\" (UID: \"e37df9f5-e512-4e43-9c96-c193553b43dd\") " pod="openshift-nmstate/nmstate-handler-lc2dx" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.971072 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnqlm\" (UniqueName: \"kubernetes.io/projected/d8e7f096-d849-4c9d-8338-2117a554f2de-kube-api-access-wnqlm\") pod \"nmstate-console-plugin-6b874cbd85-klc4h\" (UID: \"d8e7f096-d849-4c9d-8338-2117a554f2de\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klc4h" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.971133 4681 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-q2lc4\" (UniqueName: \"kubernetes.io/projected/e37df9f5-e512-4e43-9c96-c193553b43dd-kube-api-access-q2lc4\") pod \"nmstate-handler-lc2dx\" (UID: \"e37df9f5-e512-4e43-9c96-c193553b43dd\") " pod="openshift-nmstate/nmstate-handler-lc2dx" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.971223 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d8e7f096-d849-4c9d-8338-2117a554f2de-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-klc4h\" (UID: \"d8e7f096-d849-4c9d-8338-2117a554f2de\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klc4h" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.971256 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e37df9f5-e512-4e43-9c96-c193553b43dd-ovs-socket\") pod \"nmstate-handler-lc2dx\" (UID: \"e37df9f5-e512-4e43-9c96-c193553b43dd\") " pod="openshift-nmstate/nmstate-handler-lc2dx" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.971309 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/64eeb4ec-129d-4fc8-be68-138e9c28cd3c-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-7zfgs\" (UID: \"64eeb4ec-129d-4fc8-be68-138e9c28cd3c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7zfgs" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.971339 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk6pg\" (UniqueName: \"kubernetes.io/projected/64eeb4ec-129d-4fc8-be68-138e9c28cd3c-kube-api-access-jk6pg\") pod \"nmstate-webhook-6cdbc54649-7zfgs\" (UID: \"64eeb4ec-129d-4fc8-be68-138e9c28cd3c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7zfgs" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.973139 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e37df9f5-e512-4e43-9c96-c193553b43dd-ovs-socket\") pod \"nmstate-handler-lc2dx\" (UID: \"e37df9f5-e512-4e43-9c96-c193553b43dd\") " pod="openshift-nmstate/nmstate-handler-lc2dx" Oct 07 17:15:47 crc kubenswrapper[4681]: E1007 17:15:47.973285 4681 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 07 17:15:47 crc kubenswrapper[4681]: E1007 17:15:47.973336 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64eeb4ec-129d-4fc8-be68-138e9c28cd3c-tls-key-pair podName:64eeb4ec-129d-4fc8-be68-138e9c28cd3c nodeName:}" failed. No retries permitted until 2025-10-07 17:15:48.473319026 +0000 UTC m=+752.120730581 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/64eeb4ec-129d-4fc8-be68-138e9c28cd3c-tls-key-pair") pod "nmstate-webhook-6cdbc54649-7zfgs" (UID: "64eeb4ec-129d-4fc8-be68-138e9c28cd3c") : secret "openshift-nmstate-webhook" not found Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.973559 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e37df9f5-e512-4e43-9c96-c193553b43dd-dbus-socket\") pod \"nmstate-handler-lc2dx\" (UID: \"e37df9f5-e512-4e43-9c96-c193553b43dd\") " pod="openshift-nmstate/nmstate-handler-lc2dx" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.978846 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.983132 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-klc4h"] Oct 07 17:15:47 crc kubenswrapper[4681]: I1007 17:15:47.992747 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2lc4\" (UniqueName: \"kubernetes.io/projected/e37df9f5-e512-4e43-9c96-c193553b43dd-kube-api-access-q2lc4\") pod \"nmstate-handler-lc2dx\" (UID: \"e37df9f5-e512-4e43-9c96-c193553b43dd\") " pod="openshift-nmstate/nmstate-handler-lc2dx" Oct 07 17:15:48 crc kubenswrapper[4681]: I1007 17:15:47.998545 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk6pg\" (UniqueName: \"kubernetes.io/projected/64eeb4ec-129d-4fc8-be68-138e9c28cd3c-kube-api-access-jk6pg\") pod \"nmstate-webhook-6cdbc54649-7zfgs\" (UID: \"64eeb4ec-129d-4fc8-be68-138e9c28cd3c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7zfgs" Oct 07 17:15:48 crc kubenswrapper[4681]: I1007 17:15:48.072649 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8e7f096-d849-4c9d-8338-2117a554f2de-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-klc4h\" (UID: \"d8e7f096-d849-4c9d-8338-2117a554f2de\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klc4h" Oct 07 17:15:48 crc kubenswrapper[4681]: I1007 17:15:48.072703 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnqlm\" (UniqueName: \"kubernetes.io/projected/d8e7f096-d849-4c9d-8338-2117a554f2de-kube-api-access-wnqlm\") pod \"nmstate-console-plugin-6b874cbd85-klc4h\" (UID: \"d8e7f096-d849-4c9d-8338-2117a554f2de\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klc4h" Oct 07 17:15:48 crc kubenswrapper[4681]: I1007 17:15:48.072762 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d8e7f096-d849-4c9d-8338-2117a554f2de-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-klc4h\" (UID: \"d8e7f096-d849-4c9d-8338-2117a554f2de\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klc4h" Oct 07 17:15:48 crc kubenswrapper[4681]: E1007 17:15:48.073138 4681 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Oct 07 17:15:48 crc kubenswrapper[4681]: E1007 17:15:48.073228 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8e7f096-d849-4c9d-8338-2117a554f2de-plugin-serving-cert podName:d8e7f096-d849-4c9d-8338-2117a554f2de nodeName:}" failed. 
Oct 07 17:15:48 crc kubenswrapper[4681]: E1007 17:15:48.073228 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8e7f096-d849-4c9d-8338-2117a554f2de-plugin-serving-cert podName:d8e7f096-d849-4c9d-8338-2117a554f2de nodeName:}" failed. No retries permitted until 2025-10-07 17:15:48.573207689 +0000 UTC m=+752.220619244 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/d8e7f096-d849-4c9d-8338-2117a554f2de-plugin-serving-cert") pod "nmstate-console-plugin-6b874cbd85-klc4h" (UID: "d8e7f096-d849-4c9d-8338-2117a554f2de") : secret "plugin-serving-cert" not found
Oct 07 17:15:48 crc kubenswrapper[4681]: I1007 17:15:48.073616 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d8e7f096-d849-4c9d-8338-2117a554f2de-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-klc4h\" (UID: \"d8e7f096-d849-4c9d-8338-2117a554f2de\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klc4h"
Oct 07 17:15:48 crc kubenswrapper[4681]: I1007 17:15:48.073801 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vkkc9"
Oct 07 17:15:48 crc kubenswrapper[4681]: I1007 17:15:48.104361 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnqlm\" (UniqueName: \"kubernetes.io/projected/d8e7f096-d849-4c9d-8338-2117a554f2de-kube-api-access-wnqlm\") pod \"nmstate-console-plugin-6b874cbd85-klc4h\" (UID: \"d8e7f096-d849-4c9d-8338-2117a554f2de\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klc4h"
Oct 07 17:15:48 crc kubenswrapper[4681]: I1007 17:15:48.124433 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-lc2dx"
Oct 07 17:15:48 crc kubenswrapper[4681]: W1007 17:15:48.158188 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode37df9f5_e512_4e43_9c96_c193553b43dd.slice/crio-30224cd6b29821c9e4036fe773d414e7d0db807fe9508f87d3515f5f903de838 WatchSource:0}: Error finding container 30224cd6b29821c9e4036fe773d414e7d0db807fe9508f87d3515f5f903de838: Status 404 returned error can't find the container with id 30224cd6b29821c9e4036fe773d414e7d0db807fe9508f87d3515f5f903de838
Oct 07 17:15:48 crc kubenswrapper[4681]: I1007 17:15:48.476368 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/64eeb4ec-129d-4fc8-be68-138e9c28cd3c-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-7zfgs\" (UID: \"64eeb4ec-129d-4fc8-be68-138e9c28cd3c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7zfgs"
Oct 07 17:15:48 crc kubenswrapper[4681]: I1007 17:15:48.481698 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/64eeb4ec-129d-4fc8-be68-138e9c28cd3c-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-7zfgs\" (UID: \"64eeb4ec-129d-4fc8-be68-138e9c28cd3c\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7zfgs"
Oct 07 17:15:48 crc kubenswrapper[4681]: I1007 17:15:48.514457 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-vkkc9"]
Oct 07 17:15:48 crc kubenswrapper[4681]: I1007 17:15:48.577498 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8e7f096-d849-4c9d-8338-2117a554f2de-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-klc4h\" (UID: \"d8e7f096-d849-4c9d-8338-2117a554f2de\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klc4h"
Oct 07 17:15:48 crc kubenswrapper[4681]: I1007 17:15:48.581179 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8e7f096-d849-4c9d-8338-2117a554f2de-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-klc4h\" (UID: \"d8e7f096-d849-4c9d-8338-2117a554f2de\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klc4h"
Oct 07 17:15:48 crc kubenswrapper[4681]: I1007 17:15:48.591379 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klc4h"
Oct 07 17:15:48 crc kubenswrapper[4681]: I1007 17:15:48.629837 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-lc2dx" event={"ID":"e37df9f5-e512-4e43-9c96-c193553b43dd","Type":"ContainerStarted","Data":"30224cd6b29821c9e4036fe773d414e7d0db807fe9508f87d3515f5f903de838"}
Oct 07 17:15:48 crc kubenswrapper[4681]: I1007 17:15:48.632923 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vkkc9" event={"ID":"abc9d0ca-7b47-4f55-93ff-2f6cfa725fe7","Type":"ContainerStarted","Data":"177ac3cb950e8312847b27a20a4b99c2077c1be2a5430a34cb60e577bd2cabf8"}
Oct 07 17:15:48 crc kubenswrapper[4681]: I1007 17:15:48.701235 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7zfgs"
Oct 07 17:15:49 crc kubenswrapper[4681]: I1007 17:15:49.004468 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-klc4h"]
Oct 07 17:15:49 crc kubenswrapper[4681]: I1007 17:15:49.102764 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-7zfgs"]
Oct 07 17:15:49 crc kubenswrapper[4681]: W1007 17:15:49.109220 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64eeb4ec_129d_4fc8_be68_138e9c28cd3c.slice/crio-d8e1ac6e69a46c7a23e038c1b73054cf1b6ccf461869c1b04b310844506e415f WatchSource:0}: Error finding container d8e1ac6e69a46c7a23e038c1b73054cf1b6ccf461869c1b04b310844506e415f: Status 404 returned error can't find the container with id d8e1ac6e69a46c7a23e038c1b73054cf1b6ccf461869c1b04b310844506e415f
Oct 07 17:15:49 crc kubenswrapper[4681]: I1007 17:15:49.638778 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7zfgs" event={"ID":"64eeb4ec-129d-4fc8-be68-138e9c28cd3c","Type":"ContainerStarted","Data":"d8e1ac6e69a46c7a23e038c1b73054cf1b6ccf461869c1b04b310844506e415f"}
Oct 07 17:15:49 crc kubenswrapper[4681]: I1007 17:15:49.640299 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klc4h" event={"ID":"d8e7f096-d849-4c9d-8338-2117a554f2de","Type":"ContainerStarted","Data":"338a2e0a5ce6b345365267a5e269b2fbd1410e31e62fa10c6effc5a63ded1526"}
Oct 07 17:15:49 crc kubenswrapper[4681]: I1007 17:15:49.762713 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7bb979c75c-mzkjc"]
Need to start a new one" pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:15:49 crc kubenswrapper[4681]: I1007 17:15:49.771328 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bb979c75c-mzkjc"] Oct 07 17:15:49 crc kubenswrapper[4681]: I1007 17:15:49.798733 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6b036693-8a70-4fab-b992-e9a229bc141e-console-oauth-config\") pod \"console-7bb979c75c-mzkjc\" (UID: \"6b036693-8a70-4fab-b992-e9a229bc141e\") " pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:15:49 crc kubenswrapper[4681]: I1007 17:15:49.799157 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b036693-8a70-4fab-b992-e9a229bc141e-service-ca\") pod \"console-7bb979c75c-mzkjc\" (UID: \"6b036693-8a70-4fab-b992-e9a229bc141e\") " pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:15:49 crc kubenswrapper[4681]: I1007 17:15:49.799196 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6b036693-8a70-4fab-b992-e9a229bc141e-console-config\") pod \"console-7bb979c75c-mzkjc\" (UID: \"6b036693-8a70-4fab-b992-e9a229bc141e\") " pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:15:49 crc kubenswrapper[4681]: I1007 17:15:49.799232 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6b036693-8a70-4fab-b992-e9a229bc141e-oauth-serving-cert\") pod \"console-7bb979c75c-mzkjc\" (UID: \"6b036693-8a70-4fab-b992-e9a229bc141e\") " pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:15:49 crc kubenswrapper[4681]: I1007 17:15:49.799505 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b036693-8a70-4fab-b992-e9a229bc141e-console-serving-cert\") pod \"console-7bb979c75c-mzkjc\" (UID: \"6b036693-8a70-4fab-b992-e9a229bc141e\") " pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:15:49 crc kubenswrapper[4681]: I1007 17:15:49.799593 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n826p\" (UniqueName: \"kubernetes.io/projected/6b036693-8a70-4fab-b992-e9a229bc141e-kube-api-access-n826p\") pod \"console-7bb979c75c-mzkjc\" (UID: \"6b036693-8a70-4fab-b992-e9a229bc141e\") " pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:15:49 crc kubenswrapper[4681]: I1007 17:15:49.799669 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b036693-8a70-4fab-b992-e9a229bc141e-trusted-ca-bundle\") pod \"console-7bb979c75c-mzkjc\" (UID: \"6b036693-8a70-4fab-b992-e9a229bc141e\") " pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:15:49 crc kubenswrapper[4681]: I1007 17:15:49.901282 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b036693-8a70-4fab-b992-e9a229bc141e-console-serving-cert\") pod \"console-7bb979c75c-mzkjc\" (UID: \"6b036693-8a70-4fab-b992-e9a229bc141e\") " pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:15:49 crc 
kubenswrapper[4681]: I1007 17:15:49.901329 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n826p\" (UniqueName: \"kubernetes.io/projected/6b036693-8a70-4fab-b992-e9a229bc141e-kube-api-access-n826p\") pod \"console-7bb979c75c-mzkjc\" (UID: \"6b036693-8a70-4fab-b992-e9a229bc141e\") " pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:15:49 crc kubenswrapper[4681]: I1007 17:15:49.901364 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b036693-8a70-4fab-b992-e9a229bc141e-trusted-ca-bundle\") pod \"console-7bb979c75c-mzkjc\" (UID: \"6b036693-8a70-4fab-b992-e9a229bc141e\") " pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:15:49 crc kubenswrapper[4681]: I1007 17:15:49.901402 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6b036693-8a70-4fab-b992-e9a229bc141e-console-oauth-config\") pod \"console-7bb979c75c-mzkjc\" (UID: \"6b036693-8a70-4fab-b992-e9a229bc141e\") " pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:15:49 crc kubenswrapper[4681]: I1007 17:15:49.901433 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b036693-8a70-4fab-b992-e9a229bc141e-service-ca\") pod \"console-7bb979c75c-mzkjc\" (UID: \"6b036693-8a70-4fab-b992-e9a229bc141e\") " pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:15:49 crc kubenswrapper[4681]: I1007 17:15:49.901634 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6b036693-8a70-4fab-b992-e9a229bc141e-console-config\") pod \"console-7bb979c75c-mzkjc\" (UID: \"6b036693-8a70-4fab-b992-e9a229bc141e\") " pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:15:49 crc kubenswrapper[4681]: I1007 17:15:49.901667 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6b036693-8a70-4fab-b992-e9a229bc141e-oauth-serving-cert\") pod \"console-7bb979c75c-mzkjc\" (UID: \"6b036693-8a70-4fab-b992-e9a229bc141e\") " pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:15:49 crc kubenswrapper[4681]: I1007 17:15:49.902451 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b036693-8a70-4fab-b992-e9a229bc141e-service-ca\") pod \"console-7bb979c75c-mzkjc\" (UID: \"6b036693-8a70-4fab-b992-e9a229bc141e\") " pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:15:49 crc kubenswrapper[4681]: I1007 17:15:49.902567 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b036693-8a70-4fab-b992-e9a229bc141e-trusted-ca-bundle\") pod \"console-7bb979c75c-mzkjc\" (UID: \"6b036693-8a70-4fab-b992-e9a229bc141e\") " pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:15:49 crc kubenswrapper[4681]: I1007 17:15:49.907343 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6b036693-8a70-4fab-b992-e9a229bc141e-console-config\") pod \"console-7bb979c75c-mzkjc\" (UID: \"6b036693-8a70-4fab-b992-e9a229bc141e\") " pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:15:49 crc kubenswrapper[4681]: I1007 17:15:49.907757 4681 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6b036693-8a70-4fab-b992-e9a229bc141e-oauth-serving-cert\") pod \"console-7bb979c75c-mzkjc\" (UID: \"6b036693-8a70-4fab-b992-e9a229bc141e\") " pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:15:49 crc kubenswrapper[4681]: I1007 17:15:49.908236 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b036693-8a70-4fab-b992-e9a229bc141e-console-serving-cert\") pod \"console-7bb979c75c-mzkjc\" (UID: \"6b036693-8a70-4fab-b992-e9a229bc141e\") " pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:15:49 crc kubenswrapper[4681]: I1007 17:15:49.909066 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6b036693-8a70-4fab-b992-e9a229bc141e-console-oauth-config\") pod \"console-7bb979c75c-mzkjc\" (UID: \"6b036693-8a70-4fab-b992-e9a229bc141e\") " pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:15:49 crc kubenswrapper[4681]: I1007 17:15:49.919519 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n826p\" (UniqueName: \"kubernetes.io/projected/6b036693-8a70-4fab-b992-e9a229bc141e-kube-api-access-n826p\") pod \"console-7bb979c75c-mzkjc\" (UID: \"6b036693-8a70-4fab-b992-e9a229bc141e\") " pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:15:50 crc kubenswrapper[4681]: I1007 17:15:50.083389 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:15:50 crc kubenswrapper[4681]: I1007 17:15:50.585694 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bb979c75c-mzkjc"] Oct 07 17:15:50 crc kubenswrapper[4681]: W1007 17:15:50.598222 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b036693_8a70_4fab_b992_e9a229bc141e.slice/crio-ec2486423b96f773cfe20aec64f569a9433c8ff8d326b8bf4acb7c0f5c3c1a7a WatchSource:0}: Error finding container ec2486423b96f773cfe20aec64f569a9433c8ff8d326b8bf4acb7c0f5c3c1a7a: Status 404 returned error can't find the container with id ec2486423b96f773cfe20aec64f569a9433c8ff8d326b8bf4acb7c0f5c3c1a7a Oct 07 17:15:50 crc kubenswrapper[4681]: I1007 17:15:50.647456 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bb979c75c-mzkjc" event={"ID":"6b036693-8a70-4fab-b992-e9a229bc141e","Type":"ContainerStarted","Data":"ec2486423b96f773cfe20aec64f569a9433c8ff8d326b8bf4acb7c0f5c3c1a7a"} Oct 07 17:15:51 crc kubenswrapper[4681]: I1007 17:15:51.504469 4681 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 07 17:15:52 crc kubenswrapper[4681]: I1007 17:15:52.673976 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bb979c75c-mzkjc" event={"ID":"6b036693-8a70-4fab-b992-e9a229bc141e","Type":"ContainerStarted","Data":"72ab0eccb7bae1e6b5aa5cdfe2a01ca1866cc83ab33ca9d186e8980fd27a3907"} Oct 07 17:15:52 crc kubenswrapper[4681]: I1007 17:15:52.692607 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7bb979c75c-mzkjc" podStartSLOduration=3.692590358 podStartE2EDuration="3.692590358s" podCreationTimestamp="2025-10-07 17:15:49 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:15:52.690649255 +0000 UTC m=+756.338060820" watchObservedRunningTime="2025-10-07 17:15:52.692590358 +0000 UTC m=+756.340001913" Oct 07 17:15:53 crc kubenswrapper[4681]: I1007 17:15:53.682413 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klc4h" event={"ID":"d8e7f096-d849-4c9d-8338-2117a554f2de","Type":"ContainerStarted","Data":"059517794c594f42145f2cf4d1015e1fb58cf5b099ecb09d57ad4f7539ea623a"} Oct 07 17:15:53 crc kubenswrapper[4681]: I1007 17:15:53.684687 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vkkc9" event={"ID":"abc9d0ca-7b47-4f55-93ff-2f6cfa725fe7","Type":"ContainerStarted","Data":"c03c74f86f2ab6d2f2efe04bf5afc60743e0ae0725b8a98599c8892b5aca867f"} Oct 07 17:15:53 crc kubenswrapper[4681]: I1007 17:15:53.686166 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7zfgs" event={"ID":"64eeb4ec-129d-4fc8-be68-138e9c28cd3c","Type":"ContainerStarted","Data":"272de6a77c381a20467b03c1bcd32e456c2c26f99bdc97fe7ad68d6a9741de54"} Oct 07 17:15:53 crc kubenswrapper[4681]: I1007 17:15:53.686265 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7zfgs" Oct 07 17:15:53 crc kubenswrapper[4681]: I1007 17:15:53.687987 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-lc2dx" event={"ID":"e37df9f5-e512-4e43-9c96-c193553b43dd","Type":"ContainerStarted","Data":"95e28546e2f427cb3047e153becf950c061b5f31a8d7ee056c32c20ce7ae4d70"} Oct 07 17:15:53 crc kubenswrapper[4681]: I1007 17:15:53.688428 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-lc2dx" Oct 07 17:15:53 crc kubenswrapper[4681]: I1007 17:15:53.701949 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-klc4h" podStartSLOduration=3.164609733 podStartE2EDuration="6.70193081s" podCreationTimestamp="2025-10-07 17:15:47 +0000 UTC" firstStartedPulling="2025-10-07 17:15:49.00866823 +0000 UTC m=+752.656079785" lastFinishedPulling="2025-10-07 17:15:52.545989307 +0000 UTC m=+756.193400862" observedRunningTime="2025-10-07 17:15:53.697773627 +0000 UTC m=+757.345185182" watchObservedRunningTime="2025-10-07 17:15:53.70193081 +0000 UTC m=+757.349342365" Oct 07 17:15:53 crc kubenswrapper[4681]: I1007 17:15:53.743729 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-lc2dx" podStartSLOduration=2.357011619 podStartE2EDuration="6.743707383s" podCreationTimestamp="2025-10-07 17:15:47 +0000 UTC" firstStartedPulling="2025-10-07 17:15:48.160357533 +0000 UTC m=+751.807769088" lastFinishedPulling="2025-10-07 17:15:52.547053287 +0000 UTC m=+756.194464852" observedRunningTime="2025-10-07 17:15:53.74102746 +0000 UTC m=+757.388439015" watchObservedRunningTime="2025-10-07 17:15:53.743707383 +0000 UTC m=+757.391118938" Oct 07 17:15:53 crc kubenswrapper[4681]: I1007 17:15:53.761632 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7zfgs" podStartSLOduration=3.307173592 podStartE2EDuration="6.761604192s" podCreationTimestamp="2025-10-07 17:15:47 +0000 UTC" firstStartedPulling="2025-10-07 
17:15:49.111120422 +0000 UTC m=+752.758531977" lastFinishedPulling="2025-10-07 17:15:52.565551022 +0000 UTC m=+756.212962577" observedRunningTime="2025-10-07 17:15:53.756222445 +0000 UTC m=+757.403634000" watchObservedRunningTime="2025-10-07 17:15:53.761604192 +0000 UTC m=+757.409015767" Oct 07 17:15:55 crc kubenswrapper[4681]: I1007 17:15:55.704442 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vkkc9" event={"ID":"abc9d0ca-7b47-4f55-93ff-2f6cfa725fe7","Type":"ContainerStarted","Data":"91f8ff1f5445e5fa865ccda083736a33d6c2faeb40db0e14ca0bfb93b667c57c"} Oct 07 17:15:55 crc kubenswrapper[4681]: I1007 17:15:55.722822 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vkkc9" podStartSLOduration=2.224571296 podStartE2EDuration="8.722799643s" podCreationTimestamp="2025-10-07 17:15:47 +0000 UTC" firstStartedPulling="2025-10-07 17:15:48.512382254 +0000 UTC m=+752.159793809" lastFinishedPulling="2025-10-07 17:15:55.010610571 +0000 UTC m=+758.658022156" observedRunningTime="2025-10-07 17:15:55.719150463 +0000 UTC m=+759.366562018" watchObservedRunningTime="2025-10-07 17:15:55.722799643 +0000 UTC m=+759.370211188" Oct 07 17:15:58 crc kubenswrapper[4681]: I1007 17:15:58.147512 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-lc2dx" Oct 07 17:16:00 crc kubenswrapper[4681]: I1007 17:16:00.085071 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:16:00 crc kubenswrapper[4681]: I1007 17:16:00.085449 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:16:00 crc kubenswrapper[4681]: I1007 17:16:00.091186 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:16:00 crc kubenswrapper[4681]: I1007 17:16:00.742290 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7bb979c75c-mzkjc" Oct 07 17:16:00 crc kubenswrapper[4681]: I1007 17:16:00.808783 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nds8d"] Oct 07 17:16:08 crc kubenswrapper[4681]: I1007 17:16:08.707368 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7zfgs" Oct 07 17:16:20 crc kubenswrapper[4681]: I1007 17:16:20.897309 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj"] Oct 07 17:16:20 crc kubenswrapper[4681]: I1007 17:16:20.899096 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj" Oct 07 17:16:20 crc kubenswrapper[4681]: I1007 17:16:20.901000 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 07 17:16:20 crc kubenswrapper[4681]: I1007 17:16:20.914691 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj"] Oct 07 17:16:20 crc kubenswrapper[4681]: I1007 17:16:20.953474 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9112c3b1-a90a-48d2-9282-cd9f4c055d39-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj\" (UID: \"9112c3b1-a90a-48d2-9282-cd9f4c055d39\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj" Oct 07 17:16:20 crc kubenswrapper[4681]: I1007 17:16:20.953541 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsvkf\" (UniqueName: \"kubernetes.io/projected/9112c3b1-a90a-48d2-9282-cd9f4c055d39-kube-api-access-dsvkf\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj\" (UID: \"9112c3b1-a90a-48d2-9282-cd9f4c055d39\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj" Oct 07 17:16:20 crc kubenswrapper[4681]: I1007 17:16:20.953609 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9112c3b1-a90a-48d2-9282-cd9f4c055d39-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj\" (UID: \"9112c3b1-a90a-48d2-9282-cd9f4c055d39\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj" Oct 07 17:16:21 crc kubenswrapper[4681]: I1007 17:16:21.055590 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsvkf\" (UniqueName: \"kubernetes.io/projected/9112c3b1-a90a-48d2-9282-cd9f4c055d39-kube-api-access-dsvkf\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj\" (UID: \"9112c3b1-a90a-48d2-9282-cd9f4c055d39\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj" Oct 07 17:16:21 crc kubenswrapper[4681]: I1007 17:16:21.055634 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9112c3b1-a90a-48d2-9282-cd9f4c055d39-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj\" (UID: \"9112c3b1-a90a-48d2-9282-cd9f4c055d39\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj" Oct 07 17:16:21 crc kubenswrapper[4681]: I1007 17:16:21.055752 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9112c3b1-a90a-48d2-9282-cd9f4c055d39-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj\" (UID: \"9112c3b1-a90a-48d2-9282-cd9f4c055d39\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj" Oct 07 17:16:21 crc kubenswrapper[4681]: I1007 17:16:21.056499 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/9112c3b1-a90a-48d2-9282-cd9f4c055d39-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj\" (UID: \"9112c3b1-a90a-48d2-9282-cd9f4c055d39\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj" Oct 07 17:16:21 crc kubenswrapper[4681]: I1007 17:16:21.057516 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9112c3b1-a90a-48d2-9282-cd9f4c055d39-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj\" (UID: \"9112c3b1-a90a-48d2-9282-cd9f4c055d39\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj" Oct 07 17:16:21 crc kubenswrapper[4681]: I1007 17:16:21.082562 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsvkf\" (UniqueName: \"kubernetes.io/projected/9112c3b1-a90a-48d2-9282-cd9f4c055d39-kube-api-access-dsvkf\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj\" (UID: \"9112c3b1-a90a-48d2-9282-cd9f4c055d39\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj" Oct 07 17:16:21 crc kubenswrapper[4681]: I1007 17:16:21.220831 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj" Oct 07 17:16:21 crc kubenswrapper[4681]: I1007 17:16:21.439739 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj"] Oct 07 17:16:21 crc kubenswrapper[4681]: I1007 17:16:21.864287 4681 generic.go:334] "Generic (PLEG): container finished" podID="9112c3b1-a90a-48d2-9282-cd9f4c055d39" containerID="07fc36d6c6b9e9d647aff27873bfb83e825cef6908477e937b59baa59cbb33df" exitCode=0 Oct 07 17:16:21 crc kubenswrapper[4681]: I1007 17:16:21.864522 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj" event={"ID":"9112c3b1-a90a-48d2-9282-cd9f4c055d39","Type":"ContainerDied","Data":"07fc36d6c6b9e9d647aff27873bfb83e825cef6908477e937b59baa59cbb33df"} Oct 07 17:16:21 crc kubenswrapper[4681]: I1007 17:16:21.864553 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj" event={"ID":"9112c3b1-a90a-48d2-9282-cd9f4c055d39","Type":"ContainerStarted","Data":"e5a791d0e6bb745a21b4d1054b96d0608e648aa831a03abbbd2b0daf76392a93"} Oct 07 17:16:23 crc kubenswrapper[4681]: I1007 17:16:23.263589 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kjztk"] Oct 07 17:16:23 crc kubenswrapper[4681]: I1007 17:16:23.265346 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kjztk" Oct 07 17:16:23 crc kubenswrapper[4681]: I1007 17:16:23.297870 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kjztk"] Oct 07 17:16:23 crc kubenswrapper[4681]: I1007 17:16:23.396699 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzng4\" (UniqueName: \"kubernetes.io/projected/0f29d3c6-1251-4e35-abb5-87df61a17eaa-kube-api-access-kzng4\") pod \"redhat-operators-kjztk\" (UID: \"0f29d3c6-1251-4e35-abb5-87df61a17eaa\") " pod="openshift-marketplace/redhat-operators-kjztk" Oct 07 17:16:23 crc kubenswrapper[4681]: I1007 17:16:23.396863 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f29d3c6-1251-4e35-abb5-87df61a17eaa-catalog-content\") pod \"redhat-operators-kjztk\" (UID: \"0f29d3c6-1251-4e35-abb5-87df61a17eaa\") " pod="openshift-marketplace/redhat-operators-kjztk" Oct 07 17:16:23 crc kubenswrapper[4681]: I1007 17:16:23.396990 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f29d3c6-1251-4e35-abb5-87df61a17eaa-utilities\") pod \"redhat-operators-kjztk\" (UID: \"0f29d3c6-1251-4e35-abb5-87df61a17eaa\") " pod="openshift-marketplace/redhat-operators-kjztk" Oct 07 17:16:23 crc kubenswrapper[4681]: I1007 17:16:23.498114 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f29d3c6-1251-4e35-abb5-87df61a17eaa-catalog-content\") pod \"redhat-operators-kjztk\" (UID: \"0f29d3c6-1251-4e35-abb5-87df61a17eaa\") " pod="openshift-marketplace/redhat-operators-kjztk" Oct 07 17:16:23 crc kubenswrapper[4681]: I1007 17:16:23.498285 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f29d3c6-1251-4e35-abb5-87df61a17eaa-utilities\") pod \"redhat-operators-kjztk\" (UID: \"0f29d3c6-1251-4e35-abb5-87df61a17eaa\") " pod="openshift-marketplace/redhat-operators-kjztk" Oct 07 17:16:23 crc kubenswrapper[4681]: I1007 17:16:23.498380 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzng4\" (UniqueName: \"kubernetes.io/projected/0f29d3c6-1251-4e35-abb5-87df61a17eaa-kube-api-access-kzng4\") pod \"redhat-operators-kjztk\" (UID: \"0f29d3c6-1251-4e35-abb5-87df61a17eaa\") " pod="openshift-marketplace/redhat-operators-kjztk" Oct 07 17:16:23 crc kubenswrapper[4681]: I1007 17:16:23.498673 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f29d3c6-1251-4e35-abb5-87df61a17eaa-catalog-content\") pod \"redhat-operators-kjztk\" (UID: \"0f29d3c6-1251-4e35-abb5-87df61a17eaa\") " pod="openshift-marketplace/redhat-operators-kjztk" Oct 07 17:16:23 crc kubenswrapper[4681]: I1007 17:16:23.498813 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f29d3c6-1251-4e35-abb5-87df61a17eaa-utilities\") pod \"redhat-operators-kjztk\" (UID: \"0f29d3c6-1251-4e35-abb5-87df61a17eaa\") " pod="openshift-marketplace/redhat-operators-kjztk" Oct 07 17:16:23 crc kubenswrapper[4681]: I1007 17:16:23.522770 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kzng4\" (UniqueName: \"kubernetes.io/projected/0f29d3c6-1251-4e35-abb5-87df61a17eaa-kube-api-access-kzng4\") pod \"redhat-operators-kjztk\" (UID: \"0f29d3c6-1251-4e35-abb5-87df61a17eaa\") " pod="openshift-marketplace/redhat-operators-kjztk" Oct 07 17:16:23 crc kubenswrapper[4681]: I1007 17:16:23.592498 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kjztk" Oct 07 17:16:23 crc kubenswrapper[4681]: I1007 17:16:23.822132 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kjztk"] Oct 07 17:16:23 crc kubenswrapper[4681]: I1007 17:16:23.878560 4681 generic.go:334] "Generic (PLEG): container finished" podID="9112c3b1-a90a-48d2-9282-cd9f4c055d39" containerID="e2af851256b02aa1777a59971adec8eabd21710e92c07ff99281b77fab51018f" exitCode=0 Oct 07 17:16:23 crc kubenswrapper[4681]: I1007 17:16:23.878631 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj" event={"ID":"9112c3b1-a90a-48d2-9282-cd9f4c055d39","Type":"ContainerDied","Data":"e2af851256b02aa1777a59971adec8eabd21710e92c07ff99281b77fab51018f"} Oct 07 17:16:23 crc kubenswrapper[4681]: I1007 17:16:23.881145 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjztk" event={"ID":"0f29d3c6-1251-4e35-abb5-87df61a17eaa","Type":"ContainerStarted","Data":"7903006755b29846d94adfac542c5d58158a49bf569ecc63cdcb6086c32bace2"} Oct 07 17:16:24 crc kubenswrapper[4681]: I1007 17:16:24.888795 4681 generic.go:334] "Generic (PLEG): container finished" podID="9112c3b1-a90a-48d2-9282-cd9f4c055d39" containerID="8242b7a0c0ab426dc17ac059e0e014ba1e6e1b2854d3c403d6462bd478f2614c" exitCode=0 Oct 07 17:16:24 crc kubenswrapper[4681]: I1007 17:16:24.888900 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj" event={"ID":"9112c3b1-a90a-48d2-9282-cd9f4c055d39","Type":"ContainerDied","Data":"8242b7a0c0ab426dc17ac059e0e014ba1e6e1b2854d3c403d6462bd478f2614c"} Oct 07 17:16:24 crc kubenswrapper[4681]: I1007 17:16:24.890525 4681 generic.go:334] "Generic (PLEG): container finished" podID="0f29d3c6-1251-4e35-abb5-87df61a17eaa" containerID="25079eec9f0afce4e89b2e1d2e17de68fe85b8cf5088be07101eec9dcdc8d0db" exitCode=0 Oct 07 17:16:24 crc kubenswrapper[4681]: I1007 17:16:24.890575 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjztk" event={"ID":"0f29d3c6-1251-4e35-abb5-87df61a17eaa","Type":"ContainerDied","Data":"25079eec9f0afce4e89b2e1d2e17de68fe85b8cf5088be07101eec9dcdc8d0db"} Oct 07 17:16:25 crc kubenswrapper[4681]: I1007 17:16:25.877632 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-nds8d" podUID="dff981f7-635e-4b45-bf64-fbb57407582b" containerName="console" containerID="cri-o://8b048feef8f0f3c653902316fd6e5a4412ccbeb311847b559e5d1bf61e5c96fb" gracePeriod=15 Oct 07 17:16:25 crc kubenswrapper[4681]: I1007 17:16:25.897312 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjztk" event={"ID":"0f29d3c6-1251-4e35-abb5-87df61a17eaa","Type":"ContainerStarted","Data":"77d5b2a90069f16918bb91bcbb09f8e97937f60092bba5a76f5a76a07d68782c"} Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.143787 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.232242 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsvkf\" (UniqueName: \"kubernetes.io/projected/9112c3b1-a90a-48d2-9282-cd9f4c055d39-kube-api-access-dsvkf\") pod \"9112c3b1-a90a-48d2-9282-cd9f4c055d39\" (UID: \"9112c3b1-a90a-48d2-9282-cd9f4c055d39\") " Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.232334 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9112c3b1-a90a-48d2-9282-cd9f4c055d39-util\") pod \"9112c3b1-a90a-48d2-9282-cd9f4c055d39\" (UID: \"9112c3b1-a90a-48d2-9282-cd9f4c055d39\") " Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.232390 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9112c3b1-a90a-48d2-9282-cd9f4c055d39-bundle\") pod \"9112c3b1-a90a-48d2-9282-cd9f4c055d39\" (UID: \"9112c3b1-a90a-48d2-9282-cd9f4c055d39\") " Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.236170 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9112c3b1-a90a-48d2-9282-cd9f4c055d39-bundle" (OuterVolumeSpecName: "bundle") pod "9112c3b1-a90a-48d2-9282-cd9f4c055d39" (UID: "9112c3b1-a90a-48d2-9282-cd9f4c055d39"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.237079 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9112c3b1-a90a-48d2-9282-cd9f4c055d39-kube-api-access-dsvkf" (OuterVolumeSpecName: "kube-api-access-dsvkf") pod "9112c3b1-a90a-48d2-9282-cd9f4c055d39" (UID: "9112c3b1-a90a-48d2-9282-cd9f4c055d39"). InnerVolumeSpecName "kube-api-access-dsvkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.246591 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9112c3b1-a90a-48d2-9282-cd9f4c055d39-util" (OuterVolumeSpecName: "util") pod "9112c3b1-a90a-48d2-9282-cd9f4c055d39" (UID: "9112c3b1-a90a-48d2-9282-cd9f4c055d39"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.297493 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nds8d_dff981f7-635e-4b45-bf64-fbb57407582b/console/0.log" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.297562 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-nds8d" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.335221 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dff981f7-635e-4b45-bf64-fbb57407582b-console-config\") pod \"dff981f7-635e-4b45-bf64-fbb57407582b\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.335275 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dff981f7-635e-4b45-bf64-fbb57407582b-console-serving-cert\") pod \"dff981f7-635e-4b45-bf64-fbb57407582b\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.335310 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx96w\" (UniqueName: \"kubernetes.io/projected/dff981f7-635e-4b45-bf64-fbb57407582b-kube-api-access-bx96w\") pod \"dff981f7-635e-4b45-bf64-fbb57407582b\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.335343 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dff981f7-635e-4b45-bf64-fbb57407582b-trusted-ca-bundle\") pod \"dff981f7-635e-4b45-bf64-fbb57407582b\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.335365 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dff981f7-635e-4b45-bf64-fbb57407582b-oauth-serving-cert\") pod \"dff981f7-635e-4b45-bf64-fbb57407582b\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.335401 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dff981f7-635e-4b45-bf64-fbb57407582b-console-oauth-config\") pod \"dff981f7-635e-4b45-bf64-fbb57407582b\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.335436 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dff981f7-635e-4b45-bf64-fbb57407582b-service-ca\") pod \"dff981f7-635e-4b45-bf64-fbb57407582b\" (UID: \"dff981f7-635e-4b45-bf64-fbb57407582b\") " Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.335621 4681 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9112c3b1-a90a-48d2-9282-cd9f4c055d39-util\") on node \"crc\" DevicePath \"\"" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.335638 4681 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9112c3b1-a90a-48d2-9282-cd9f4c055d39-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.335651 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsvkf\" (UniqueName: \"kubernetes.io/projected/9112c3b1-a90a-48d2-9282-cd9f4c055d39-kube-api-access-dsvkf\") on node \"crc\" DevicePath \"\"" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.336132 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/dff981f7-635e-4b45-bf64-fbb57407582b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "dff981f7-635e-4b45-bf64-fbb57407582b" (UID: "dff981f7-635e-4b45-bf64-fbb57407582b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.336303 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dff981f7-635e-4b45-bf64-fbb57407582b-console-config" (OuterVolumeSpecName: "console-config") pod "dff981f7-635e-4b45-bf64-fbb57407582b" (UID: "dff981f7-635e-4b45-bf64-fbb57407582b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.336335 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dff981f7-635e-4b45-bf64-fbb57407582b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "dff981f7-635e-4b45-bf64-fbb57407582b" (UID: "dff981f7-635e-4b45-bf64-fbb57407582b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.336553 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dff981f7-635e-4b45-bf64-fbb57407582b-service-ca" (OuterVolumeSpecName: "service-ca") pod "dff981f7-635e-4b45-bf64-fbb57407582b" (UID: "dff981f7-635e-4b45-bf64-fbb57407582b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.338454 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dff981f7-635e-4b45-bf64-fbb57407582b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "dff981f7-635e-4b45-bf64-fbb57407582b" (UID: "dff981f7-635e-4b45-bf64-fbb57407582b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.338955 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dff981f7-635e-4b45-bf64-fbb57407582b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "dff981f7-635e-4b45-bf64-fbb57407582b" (UID: "dff981f7-635e-4b45-bf64-fbb57407582b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.339113 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dff981f7-635e-4b45-bf64-fbb57407582b-kube-api-access-bx96w" (OuterVolumeSpecName: "kube-api-access-bx96w") pod "dff981f7-635e-4b45-bf64-fbb57407582b" (UID: "dff981f7-635e-4b45-bf64-fbb57407582b"). InnerVolumeSpecName "kube-api-access-bx96w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.436461 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx96w\" (UniqueName: \"kubernetes.io/projected/dff981f7-635e-4b45-bf64-fbb57407582b-kube-api-access-bx96w\") on node \"crc\" DevicePath \"\"" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.436517 4681 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dff981f7-635e-4b45-bf64-fbb57407582b-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.436527 4681 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dff981f7-635e-4b45-bf64-fbb57407582b-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.436535 4681 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dff981f7-635e-4b45-bf64-fbb57407582b-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.436544 4681 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dff981f7-635e-4b45-bf64-fbb57407582b-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.436552 4681 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dff981f7-635e-4b45-bf64-fbb57407582b-console-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.436580 4681 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dff981f7-635e-4b45-bf64-fbb57407582b-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.903229 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj" event={"ID":"9112c3b1-a90a-48d2-9282-cd9f4c055d39","Type":"ContainerDied","Data":"e5a791d0e6bb745a21b4d1054b96d0608e648aa831a03abbbd2b0daf76392a93"} Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.903263 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5a791d0e6bb745a21b4d1054b96d0608e648aa831a03abbbd2b0daf76392a93" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.903325 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.908553 4681 generic.go:334] "Generic (PLEG): container finished" podID="0f29d3c6-1251-4e35-abb5-87df61a17eaa" containerID="77d5b2a90069f16918bb91bcbb09f8e97937f60092bba5a76f5a76a07d68782c" exitCode=0 Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.908846 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjztk" event={"ID":"0f29d3c6-1251-4e35-abb5-87df61a17eaa","Type":"ContainerDied","Data":"77d5b2a90069f16918bb91bcbb09f8e97937f60092bba5a76f5a76a07d68782c"} Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.914315 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nds8d_dff981f7-635e-4b45-bf64-fbb57407582b/console/0.log" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.914382 4681 generic.go:334] "Generic (PLEG): container finished" podID="dff981f7-635e-4b45-bf64-fbb57407582b" containerID="8b048feef8f0f3c653902316fd6e5a4412ccbeb311847b559e5d1bf61e5c96fb" exitCode=2 Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.914448 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nds8d" event={"ID":"dff981f7-635e-4b45-bf64-fbb57407582b","Type":"ContainerDied","Data":"8b048feef8f0f3c653902316fd6e5a4412ccbeb311847b559e5d1bf61e5c96fb"} Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.914471 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nds8d" event={"ID":"dff981f7-635e-4b45-bf64-fbb57407582b","Type":"ContainerDied","Data":"16e27d1919847670c69052d109897b8b3250074e3ca585a5aa77da7b9c890275"} Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.914516 4681 scope.go:117] "RemoveContainer" containerID="8b048feef8f0f3c653902316fd6e5a4412ccbeb311847b559e5d1bf61e5c96fb" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.914695 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-nds8d" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.943043 4681 scope.go:117] "RemoveContainer" containerID="8b048feef8f0f3c653902316fd6e5a4412ccbeb311847b559e5d1bf61e5c96fb" Oct 07 17:16:26 crc kubenswrapper[4681]: E1007 17:16:26.943623 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b048feef8f0f3c653902316fd6e5a4412ccbeb311847b559e5d1bf61e5c96fb\": container with ID starting with 8b048feef8f0f3c653902316fd6e5a4412ccbeb311847b559e5d1bf61e5c96fb not found: ID does not exist" containerID="8b048feef8f0f3c653902316fd6e5a4412ccbeb311847b559e5d1bf61e5c96fb" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.943656 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b048feef8f0f3c653902316fd6e5a4412ccbeb311847b559e5d1bf61e5c96fb"} err="failed to get container status \"8b048feef8f0f3c653902316fd6e5a4412ccbeb311847b559e5d1bf61e5c96fb\": rpc error: code = NotFound desc = could not find container \"8b048feef8f0f3c653902316fd6e5a4412ccbeb311847b559e5d1bf61e5c96fb\": container with ID starting with 8b048feef8f0f3c653902316fd6e5a4412ccbeb311847b559e5d1bf61e5c96fb not found: ID does not exist" Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.958849 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nds8d"] Oct 07 17:16:26 crc kubenswrapper[4681]: I1007 17:16:26.961606 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-nds8d"] Oct 07 17:16:27 crc kubenswrapper[4681]: I1007 17:16:27.035685 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dff981f7-635e-4b45-bf64-fbb57407582b" path="/var/lib/kubelet/pods/dff981f7-635e-4b45-bf64-fbb57407582b/volumes" Oct 07 17:16:27 crc kubenswrapper[4681]: I1007 17:16:27.921242 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjztk" event={"ID":"0f29d3c6-1251-4e35-abb5-87df61a17eaa","Type":"ContainerStarted","Data":"c21c77094509d1c05e24b6b3692af100eef11f38c9336fe9bfc0196952231999"} Oct 07 17:16:27 crc kubenswrapper[4681]: I1007 17:16:27.938738 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kjztk" podStartSLOduration=2.443939173 podStartE2EDuration="4.938718249s" podCreationTimestamp="2025-10-07 17:16:23 +0000 UTC" firstStartedPulling="2025-10-07 17:16:24.892277326 +0000 UTC m=+788.539688881" lastFinishedPulling="2025-10-07 17:16:27.387056402 +0000 UTC m=+791.034467957" observedRunningTime="2025-10-07 17:16:27.936597591 +0000 UTC m=+791.584009146" watchObservedRunningTime="2025-10-07 17:16:27.938718249 +0000 UTC m=+791.586129804" Oct 07 17:16:33 crc kubenswrapper[4681]: I1007 17:16:33.458801 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qmdst"] Oct 07 17:16:33 crc kubenswrapper[4681]: E1007 17:16:33.459295 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9112c3b1-a90a-48d2-9282-cd9f4c055d39" containerName="pull" Oct 07 17:16:33 crc kubenswrapper[4681]: I1007 17:16:33.459308 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="9112c3b1-a90a-48d2-9282-cd9f4c055d39" containerName="pull" Oct 07 17:16:33 crc kubenswrapper[4681]: E1007 17:16:33.459319 4681 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9112c3b1-a90a-48d2-9282-cd9f4c055d39" containerName="util" Oct 07 17:16:33 crc kubenswrapper[4681]: I1007 17:16:33.459328 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="9112c3b1-a90a-48d2-9282-cd9f4c055d39" containerName="util" Oct 07 17:16:33 crc kubenswrapper[4681]: E1007 17:16:33.459348 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff981f7-635e-4b45-bf64-fbb57407582b" containerName="console" Oct 07 17:16:33 crc kubenswrapper[4681]: I1007 17:16:33.459356 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff981f7-635e-4b45-bf64-fbb57407582b" containerName="console" Oct 07 17:16:33 crc kubenswrapper[4681]: E1007 17:16:33.459368 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9112c3b1-a90a-48d2-9282-cd9f4c055d39" containerName="extract" Oct 07 17:16:33 crc kubenswrapper[4681]: I1007 17:16:33.459374 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="9112c3b1-a90a-48d2-9282-cd9f4c055d39" containerName="extract" Oct 07 17:16:33 crc kubenswrapper[4681]: I1007 17:16:33.459497 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="dff981f7-635e-4b45-bf64-fbb57407582b" containerName="console" Oct 07 17:16:33 crc kubenswrapper[4681]: I1007 17:16:33.459515 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="9112c3b1-a90a-48d2-9282-cd9f4c055d39" containerName="extract" Oct 07 17:16:33 crc kubenswrapper[4681]: I1007 17:16:33.460411 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qmdst" Oct 07 17:16:33 crc kubenswrapper[4681]: I1007 17:16:33.471111 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qmdst"] Oct 07 17:16:33 crc kubenswrapper[4681]: I1007 17:16:33.583516 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7066ab9a-9c49-4489-bff4-d631a6e222db-utilities\") pod \"community-operators-qmdst\" (UID: \"7066ab9a-9c49-4489-bff4-d631a6e222db\") " pod="openshift-marketplace/community-operators-qmdst" Oct 07 17:16:33 crc kubenswrapper[4681]: I1007 17:16:33.583671 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7066ab9a-9c49-4489-bff4-d631a6e222db-catalog-content\") pod \"community-operators-qmdst\" (UID: \"7066ab9a-9c49-4489-bff4-d631a6e222db\") " pod="openshift-marketplace/community-operators-qmdst" Oct 07 17:16:33 crc kubenswrapper[4681]: I1007 17:16:33.583727 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45gth\" (UniqueName: \"kubernetes.io/projected/7066ab9a-9c49-4489-bff4-d631a6e222db-kube-api-access-45gth\") pod \"community-operators-qmdst\" (UID: \"7066ab9a-9c49-4489-bff4-d631a6e222db\") " pod="openshift-marketplace/community-operators-qmdst" Oct 07 17:16:33 crc kubenswrapper[4681]: I1007 17:16:33.593194 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kjztk" Oct 07 17:16:33 crc kubenswrapper[4681]: I1007 17:16:33.593241 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kjztk" Oct 07 17:16:33 crc kubenswrapper[4681]: I1007 17:16:33.640950 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-kjztk" Oct 07 17:16:33 crc kubenswrapper[4681]: I1007 17:16:33.686166 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7066ab9a-9c49-4489-bff4-d631a6e222db-utilities\") pod \"community-operators-qmdst\" (UID: \"7066ab9a-9c49-4489-bff4-d631a6e222db\") " pod="openshift-marketplace/community-operators-qmdst" Oct 07 17:16:33 crc kubenswrapper[4681]: I1007 17:16:33.686344 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7066ab9a-9c49-4489-bff4-d631a6e222db-catalog-content\") pod \"community-operators-qmdst\" (UID: \"7066ab9a-9c49-4489-bff4-d631a6e222db\") " pod="openshift-marketplace/community-operators-qmdst" Oct 07 17:16:33 crc kubenswrapper[4681]: I1007 17:16:33.686405 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45gth\" (UniqueName: \"kubernetes.io/projected/7066ab9a-9c49-4489-bff4-d631a6e222db-kube-api-access-45gth\") pod \"community-operators-qmdst\" (UID: \"7066ab9a-9c49-4489-bff4-d631a6e222db\") " pod="openshift-marketplace/community-operators-qmdst" Oct 07 17:16:33 crc kubenswrapper[4681]: I1007 17:16:33.686975 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7066ab9a-9c49-4489-bff4-d631a6e222db-catalog-content\") pod \"community-operators-qmdst\" (UID: \"7066ab9a-9c49-4489-bff4-d631a6e222db\") " pod="openshift-marketplace/community-operators-qmdst" Oct 07 17:16:33 crc kubenswrapper[4681]: I1007 17:16:33.686992 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7066ab9a-9c49-4489-bff4-d631a6e222db-utilities\") pod \"community-operators-qmdst\" (UID: \"7066ab9a-9c49-4489-bff4-d631a6e222db\") " pod="openshift-marketplace/community-operators-qmdst" Oct 07 17:16:33 crc kubenswrapper[4681]: I1007 17:16:33.712535 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45gth\" (UniqueName: \"kubernetes.io/projected/7066ab9a-9c49-4489-bff4-d631a6e222db-kube-api-access-45gth\") pod \"community-operators-qmdst\" (UID: \"7066ab9a-9c49-4489-bff4-d631a6e222db\") " pod="openshift-marketplace/community-operators-qmdst" Oct 07 17:16:33 crc kubenswrapper[4681]: I1007 17:16:33.779538 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qmdst" Oct 07 17:16:34 crc kubenswrapper[4681]: I1007 17:16:34.014243 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kjztk" Oct 07 17:16:34 crc kubenswrapper[4681]: I1007 17:16:34.245340 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qmdst"] Oct 07 17:16:34 crc kubenswrapper[4681]: I1007 17:16:34.958779 4681 generic.go:334] "Generic (PLEG): container finished" podID="7066ab9a-9c49-4489-bff4-d631a6e222db" containerID="6658a786c201100d1880b0abc05808bf16705b99cb89415f4ef308b66bf2bdda" exitCode=0 Oct 07 17:16:34 crc kubenswrapper[4681]: I1007 17:16:34.959897 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmdst" event={"ID":"7066ab9a-9c49-4489-bff4-d631a6e222db","Type":"ContainerDied","Data":"6658a786c201100d1880b0abc05808bf16705b99cb89415f4ef308b66bf2bdda"} Oct 07 17:16:34 crc kubenswrapper[4681]: I1007 17:16:34.959928 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmdst" event={"ID":"7066ab9a-9c49-4489-bff4-d631a6e222db","Type":"ContainerStarted","Data":"6e83bb58fb155391a97a235a2a782efc322b3fae0e3f796a511506ce78deae20"} Oct 07 17:16:36 crc kubenswrapper[4681]: I1007 17:16:36.851449 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kjztk"] Oct 07 17:16:36 crc kubenswrapper[4681]: I1007 17:16:36.852031 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kjztk" podUID="0f29d3c6-1251-4e35-abb5-87df61a17eaa" containerName="registry-server" containerID="cri-o://c21c77094509d1c05e24b6b3692af100eef11f38c9336fe9bfc0196952231999" gracePeriod=2 Oct 07 17:16:36 crc kubenswrapper[4681]: I1007 17:16:36.970542 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmdst" event={"ID":"7066ab9a-9c49-4489-bff4-d631a6e222db","Type":"ContainerStarted","Data":"2255049d3ad723796faa3881c8129c40fe981c97b66ebd2decf4eae79df25261"} Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.126830 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-598476574-wb9sj"] Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.127547 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-598476574-wb9sj" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.132135 4681 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.132159 4681 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.132165 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.132609 4681 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-7gzcn" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.132650 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.137445 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/abb48906-478a-4687-9e03-76d9035242b8-apiservice-cert\") pod \"metallb-operator-controller-manager-598476574-wb9sj\" (UID: \"abb48906-478a-4687-9e03-76d9035242b8\") " pod="metallb-system/metallb-operator-controller-manager-598476574-wb9sj" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.137491 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr5ps\" (UniqueName: \"kubernetes.io/projected/abb48906-478a-4687-9e03-76d9035242b8-kube-api-access-nr5ps\") pod \"metallb-operator-controller-manager-598476574-wb9sj\" (UID: \"abb48906-478a-4687-9e03-76d9035242b8\") " pod="metallb-system/metallb-operator-controller-manager-598476574-wb9sj" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.137521 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/abb48906-478a-4687-9e03-76d9035242b8-webhook-cert\") pod \"metallb-operator-controller-manager-598476574-wb9sj\" (UID: \"abb48906-478a-4687-9e03-76d9035242b8\") " pod="metallb-system/metallb-operator-controller-manager-598476574-wb9sj" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.151476 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-598476574-wb9sj"] Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.238968 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/abb48906-478a-4687-9e03-76d9035242b8-apiservice-cert\") pod \"metallb-operator-controller-manager-598476574-wb9sj\" (UID: \"abb48906-478a-4687-9e03-76d9035242b8\") " pod="metallb-system/metallb-operator-controller-manager-598476574-wb9sj" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.239033 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr5ps\" (UniqueName: \"kubernetes.io/projected/abb48906-478a-4687-9e03-76d9035242b8-kube-api-access-nr5ps\") pod \"metallb-operator-controller-manager-598476574-wb9sj\" (UID: \"abb48906-478a-4687-9e03-76d9035242b8\") " pod="metallb-system/metallb-operator-controller-manager-598476574-wb9sj" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.239080 4681 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/abb48906-478a-4687-9e03-76d9035242b8-webhook-cert\") pod \"metallb-operator-controller-manager-598476574-wb9sj\" (UID: \"abb48906-478a-4687-9e03-76d9035242b8\") " pod="metallb-system/metallb-operator-controller-manager-598476574-wb9sj" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.245839 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/abb48906-478a-4687-9e03-76d9035242b8-apiservice-cert\") pod \"metallb-operator-controller-manager-598476574-wb9sj\" (UID: \"abb48906-478a-4687-9e03-76d9035242b8\") " pod="metallb-system/metallb-operator-controller-manager-598476574-wb9sj" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.246430 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/abb48906-478a-4687-9e03-76d9035242b8-webhook-cert\") pod \"metallb-operator-controller-manager-598476574-wb9sj\" (UID: \"abb48906-478a-4687-9e03-76d9035242b8\") " pod="metallb-system/metallb-operator-controller-manager-598476574-wb9sj" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.270421 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr5ps\" (UniqueName: \"kubernetes.io/projected/abb48906-478a-4687-9e03-76d9035242b8-kube-api-access-nr5ps\") pod \"metallb-operator-controller-manager-598476574-wb9sj\" (UID: \"abb48906-478a-4687-9e03-76d9035242b8\") " pod="metallb-system/metallb-operator-controller-manager-598476574-wb9sj" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.410353 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5996f7f8c8-n6prk"] Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.411285 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5996f7f8c8-n6prk" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.413627 4681 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.414568 4681 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-xnxmt" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.414634 4681 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.433482 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5996f7f8c8-n6prk"] Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.446058 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-598476574-wb9sj" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.543249 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/350e1b80-5296-4a5b-a604-e9a42b56cbd1-webhook-cert\") pod \"metallb-operator-webhook-server-5996f7f8c8-n6prk\" (UID: \"350e1b80-5296-4a5b-a604-e9a42b56cbd1\") " pod="metallb-system/metallb-operator-webhook-server-5996f7f8c8-n6prk" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.543531 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/350e1b80-5296-4a5b-a604-e9a42b56cbd1-apiservice-cert\") pod \"metallb-operator-webhook-server-5996f7f8c8-n6prk\" (UID: \"350e1b80-5296-4a5b-a604-e9a42b56cbd1\") " pod="metallb-system/metallb-operator-webhook-server-5996f7f8c8-n6prk" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.543567 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74ww6\" (UniqueName: \"kubernetes.io/projected/350e1b80-5296-4a5b-a604-e9a42b56cbd1-kube-api-access-74ww6\") pod \"metallb-operator-webhook-server-5996f7f8c8-n6prk\" (UID: \"350e1b80-5296-4a5b-a604-e9a42b56cbd1\") " pod="metallb-system/metallb-operator-webhook-server-5996f7f8c8-n6prk" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.644543 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/350e1b80-5296-4a5b-a604-e9a42b56cbd1-webhook-cert\") pod \"metallb-operator-webhook-server-5996f7f8c8-n6prk\" (UID: \"350e1b80-5296-4a5b-a604-e9a42b56cbd1\") " pod="metallb-system/metallb-operator-webhook-server-5996f7f8c8-n6prk" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.644602 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/350e1b80-5296-4a5b-a604-e9a42b56cbd1-apiservice-cert\") pod \"metallb-operator-webhook-server-5996f7f8c8-n6prk\" (UID: \"350e1b80-5296-4a5b-a604-e9a42b56cbd1\") " pod="metallb-system/metallb-operator-webhook-server-5996f7f8c8-n6prk" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.644636 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74ww6\" (UniqueName: \"kubernetes.io/projected/350e1b80-5296-4a5b-a604-e9a42b56cbd1-kube-api-access-74ww6\") pod \"metallb-operator-webhook-server-5996f7f8c8-n6prk\" (UID: \"350e1b80-5296-4a5b-a604-e9a42b56cbd1\") " pod="metallb-system/metallb-operator-webhook-server-5996f7f8c8-n6prk" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.651153 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/350e1b80-5296-4a5b-a604-e9a42b56cbd1-apiservice-cert\") pod \"metallb-operator-webhook-server-5996f7f8c8-n6prk\" (UID: \"350e1b80-5296-4a5b-a604-e9a42b56cbd1\") " pod="metallb-system/metallb-operator-webhook-server-5996f7f8c8-n6prk" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.664588 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/350e1b80-5296-4a5b-a604-e9a42b56cbd1-webhook-cert\") pod \"metallb-operator-webhook-server-5996f7f8c8-n6prk\" (UID: \"350e1b80-5296-4a5b-a604-e9a42b56cbd1\") " 
pod="metallb-system/metallb-operator-webhook-server-5996f7f8c8-n6prk" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.664920 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74ww6\" (UniqueName: \"kubernetes.io/projected/350e1b80-5296-4a5b-a604-e9a42b56cbd1-kube-api-access-74ww6\") pod \"metallb-operator-webhook-server-5996f7f8c8-n6prk\" (UID: \"350e1b80-5296-4a5b-a604-e9a42b56cbd1\") " pod="metallb-system/metallb-operator-webhook-server-5996f7f8c8-n6prk" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.738131 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5996f7f8c8-n6prk" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.923727 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kjztk" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.950613 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzng4\" (UniqueName: \"kubernetes.io/projected/0f29d3c6-1251-4e35-abb5-87df61a17eaa-kube-api-access-kzng4\") pod \"0f29d3c6-1251-4e35-abb5-87df61a17eaa\" (UID: \"0f29d3c6-1251-4e35-abb5-87df61a17eaa\") " Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.950659 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f29d3c6-1251-4e35-abb5-87df61a17eaa-utilities\") pod \"0f29d3c6-1251-4e35-abb5-87df61a17eaa\" (UID: \"0f29d3c6-1251-4e35-abb5-87df61a17eaa\") " Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.950686 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f29d3c6-1251-4e35-abb5-87df61a17eaa-catalog-content\") pod \"0f29d3c6-1251-4e35-abb5-87df61a17eaa\" (UID: \"0f29d3c6-1251-4e35-abb5-87df61a17eaa\") " Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.954510 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f29d3c6-1251-4e35-abb5-87df61a17eaa-utilities" (OuterVolumeSpecName: "utilities") pod "0f29d3c6-1251-4e35-abb5-87df61a17eaa" (UID: "0f29d3c6-1251-4e35-abb5-87df61a17eaa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.973953 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f29d3c6-1251-4e35-abb5-87df61a17eaa-kube-api-access-kzng4" (OuterVolumeSpecName: "kube-api-access-kzng4") pod "0f29d3c6-1251-4e35-abb5-87df61a17eaa" (UID: "0f29d3c6-1251-4e35-abb5-87df61a17eaa"). InnerVolumeSpecName "kube-api-access-kzng4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.981917 4681 generic.go:334] "Generic (PLEG): container finished" podID="0f29d3c6-1251-4e35-abb5-87df61a17eaa" containerID="c21c77094509d1c05e24b6b3692af100eef11f38c9336fe9bfc0196952231999" exitCode=0 Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.981977 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kjztk" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.982003 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjztk" event={"ID":"0f29d3c6-1251-4e35-abb5-87df61a17eaa","Type":"ContainerDied","Data":"c21c77094509d1c05e24b6b3692af100eef11f38c9336fe9bfc0196952231999"} Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.982030 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kjztk" event={"ID":"0f29d3c6-1251-4e35-abb5-87df61a17eaa","Type":"ContainerDied","Data":"7903006755b29846d94adfac542c5d58158a49bf569ecc63cdcb6086c32bace2"} Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.982049 4681 scope.go:117] "RemoveContainer" containerID="c21c77094509d1c05e24b6b3692af100eef11f38c9336fe9bfc0196952231999" Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.983977 4681 generic.go:334] "Generic (PLEG): container finished" podID="7066ab9a-9c49-4489-bff4-d631a6e222db" containerID="2255049d3ad723796faa3881c8129c40fe981c97b66ebd2decf4eae79df25261" exitCode=0 Oct 07 17:16:37 crc kubenswrapper[4681]: I1007 17:16:37.984094 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmdst" event={"ID":"7066ab9a-9c49-4489-bff4-d631a6e222db","Type":"ContainerDied","Data":"2255049d3ad723796faa3881c8129c40fe981c97b66ebd2decf4eae79df25261"} Oct 07 17:16:38 crc kubenswrapper[4681]: I1007 17:16:38.034356 4681 scope.go:117] "RemoveContainer" containerID="77d5b2a90069f16918bb91bcbb09f8e97937f60092bba5a76f5a76a07d68782c" Oct 07 17:16:38 crc kubenswrapper[4681]: I1007 17:16:38.056705 4681 scope.go:117] "RemoveContainer" containerID="25079eec9f0afce4e89b2e1d2e17de68fe85b8cf5088be07101eec9dcdc8d0db" Oct 07 17:16:38 crc kubenswrapper[4681]: I1007 17:16:38.057630 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzng4\" (UniqueName: \"kubernetes.io/projected/0f29d3c6-1251-4e35-abb5-87df61a17eaa-kube-api-access-kzng4\") on node \"crc\" DevicePath \"\"" Oct 07 17:16:38 crc kubenswrapper[4681]: I1007 17:16:38.057650 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f29d3c6-1251-4e35-abb5-87df61a17eaa-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 17:16:38 crc kubenswrapper[4681]: I1007 17:16:38.072603 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f29d3c6-1251-4e35-abb5-87df61a17eaa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f29d3c6-1251-4e35-abb5-87df61a17eaa" (UID: "0f29d3c6-1251-4e35-abb5-87df61a17eaa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:16:38 crc kubenswrapper[4681]: I1007 17:16:38.109387 4681 scope.go:117] "RemoveContainer" containerID="c21c77094509d1c05e24b6b3692af100eef11f38c9336fe9bfc0196952231999" Oct 07 17:16:38 crc kubenswrapper[4681]: I1007 17:16:38.110147 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-598476574-wb9sj"] Oct 07 17:16:38 crc kubenswrapper[4681]: E1007 17:16:38.110398 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c21c77094509d1c05e24b6b3692af100eef11f38c9336fe9bfc0196952231999\": container with ID starting with c21c77094509d1c05e24b6b3692af100eef11f38c9336fe9bfc0196952231999 not found: ID does not exist" containerID="c21c77094509d1c05e24b6b3692af100eef11f38c9336fe9bfc0196952231999" Oct 07 17:16:38 crc kubenswrapper[4681]: I1007 17:16:38.110457 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c21c77094509d1c05e24b6b3692af100eef11f38c9336fe9bfc0196952231999"} err="failed to get container status \"c21c77094509d1c05e24b6b3692af100eef11f38c9336fe9bfc0196952231999\": rpc error: code = NotFound desc = could not find container \"c21c77094509d1c05e24b6b3692af100eef11f38c9336fe9bfc0196952231999\": container with ID starting with c21c77094509d1c05e24b6b3692af100eef11f38c9336fe9bfc0196952231999 not found: ID does not exist" Oct 07 17:16:38 crc kubenswrapper[4681]: I1007 17:16:38.110479 4681 scope.go:117] "RemoveContainer" containerID="77d5b2a90069f16918bb91bcbb09f8e97937f60092bba5a76f5a76a07d68782c" Oct 07 17:16:38 crc kubenswrapper[4681]: E1007 17:16:38.113972 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77d5b2a90069f16918bb91bcbb09f8e97937f60092bba5a76f5a76a07d68782c\": container with ID starting with 77d5b2a90069f16918bb91bcbb09f8e97937f60092bba5a76f5a76a07d68782c not found: ID does not exist" containerID="77d5b2a90069f16918bb91bcbb09f8e97937f60092bba5a76f5a76a07d68782c" Oct 07 17:16:38 crc kubenswrapper[4681]: I1007 17:16:38.114009 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77d5b2a90069f16918bb91bcbb09f8e97937f60092bba5a76f5a76a07d68782c"} err="failed to get container status \"77d5b2a90069f16918bb91bcbb09f8e97937f60092bba5a76f5a76a07d68782c\": rpc error: code = NotFound desc = could not find container \"77d5b2a90069f16918bb91bcbb09f8e97937f60092bba5a76f5a76a07d68782c\": container with ID starting with 77d5b2a90069f16918bb91bcbb09f8e97937f60092bba5a76f5a76a07d68782c not found: ID does not exist" Oct 07 17:16:38 crc kubenswrapper[4681]: I1007 17:16:38.114033 4681 scope.go:117] "RemoveContainer" containerID="25079eec9f0afce4e89b2e1d2e17de68fe85b8cf5088be07101eec9dcdc8d0db" Oct 07 17:16:38 crc kubenswrapper[4681]: E1007 17:16:38.115211 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25079eec9f0afce4e89b2e1d2e17de68fe85b8cf5088be07101eec9dcdc8d0db\": container with ID starting with 25079eec9f0afce4e89b2e1d2e17de68fe85b8cf5088be07101eec9dcdc8d0db not found: ID does not exist" containerID="25079eec9f0afce4e89b2e1d2e17de68fe85b8cf5088be07101eec9dcdc8d0db" Oct 07 17:16:38 crc kubenswrapper[4681]: I1007 17:16:38.115252 4681 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"25079eec9f0afce4e89b2e1d2e17de68fe85b8cf5088be07101eec9dcdc8d0db"} err="failed to get container status \"25079eec9f0afce4e89b2e1d2e17de68fe85b8cf5088be07101eec9dcdc8d0db\": rpc error: code = NotFound desc = could not find container \"25079eec9f0afce4e89b2e1d2e17de68fe85b8cf5088be07101eec9dcdc8d0db\": container with ID starting with 25079eec9f0afce4e89b2e1d2e17de68fe85b8cf5088be07101eec9dcdc8d0db not found: ID does not exist" Oct 07 17:16:38 crc kubenswrapper[4681]: I1007 17:16:38.139592 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5996f7f8c8-n6prk"] Oct 07 17:16:38 crc kubenswrapper[4681]: I1007 17:16:38.160967 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f29d3c6-1251-4e35-abb5-87df61a17eaa-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 17:16:38 crc kubenswrapper[4681]: I1007 17:16:38.336966 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kjztk"] Oct 07 17:16:38 crc kubenswrapper[4681]: I1007 17:16:38.341732 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kjztk"] Oct 07 17:16:38 crc kubenswrapper[4681]: I1007 17:16:38.990185 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-598476574-wb9sj" event={"ID":"abb48906-478a-4687-9e03-76d9035242b8","Type":"ContainerStarted","Data":"c63189e15eae4d124255a7bfc965f25074b149ab814a4c9da4265d90d47ba7b9"} Oct 07 17:16:38 crc kubenswrapper[4681]: I1007 17:16:38.991380 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5996f7f8c8-n6prk" event={"ID":"350e1b80-5296-4a5b-a604-e9a42b56cbd1","Type":"ContainerStarted","Data":"44696276af97f0f0d956456083acbdc2d6188d010bd0bcc97098e0ed99050f3a"} Oct 07 17:16:38 crc kubenswrapper[4681]: I1007 17:16:38.994223 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmdst" event={"ID":"7066ab9a-9c49-4489-bff4-d631a6e222db","Type":"ContainerStarted","Data":"2d7bafcddd53e721c49607aa3a2ebfe2be874eb8ec363f288550e4e5d6361405"} Oct 07 17:16:39 crc kubenswrapper[4681]: I1007 17:16:39.016985 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qmdst" podStartSLOduration=2.504739529 podStartE2EDuration="6.01696689s" podCreationTimestamp="2025-10-07 17:16:33 +0000 UTC" firstStartedPulling="2025-10-07 17:16:34.960725944 +0000 UTC m=+798.608137499" lastFinishedPulling="2025-10-07 17:16:38.472953305 +0000 UTC m=+802.120364860" observedRunningTime="2025-10-07 17:16:39.015556612 +0000 UTC m=+802.662968167" watchObservedRunningTime="2025-10-07 17:16:39.01696689 +0000 UTC m=+802.664378445" Oct 07 17:16:39 crc kubenswrapper[4681]: I1007 17:16:39.036930 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f29d3c6-1251-4e35-abb5-87df61a17eaa" path="/var/lib/kubelet/pods/0f29d3c6-1251-4e35-abb5-87df61a17eaa/volumes" Oct 07 17:16:42 crc kubenswrapper[4681]: I1007 17:16:42.015288 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-598476574-wb9sj" event={"ID":"abb48906-478a-4687-9e03-76d9035242b8","Type":"ContainerStarted","Data":"b01fe3e84bcf569a7b886aa039468be9947f419ed64e81124ccd67004ad00abb"} Oct 07 17:16:42 crc kubenswrapper[4681]: I1007 17:16:42.015534 
4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-598476574-wb9sj" Oct 07 17:16:42 crc kubenswrapper[4681]: I1007 17:16:42.035663 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-598476574-wb9sj" podStartSLOduration=1.59619021 podStartE2EDuration="5.035642574s" podCreationTimestamp="2025-10-07 17:16:37 +0000 UTC" firstStartedPulling="2025-10-07 17:16:38.126249537 +0000 UTC m=+801.773661082" lastFinishedPulling="2025-10-07 17:16:41.565701891 +0000 UTC m=+805.213113446" observedRunningTime="2025-10-07 17:16:42.030196523 +0000 UTC m=+805.677608078" watchObservedRunningTime="2025-10-07 17:16:42.035642574 +0000 UTC m=+805.683054129" Oct 07 17:16:43 crc kubenswrapper[4681]: I1007 17:16:43.780128 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qmdst" Oct 07 17:16:43 crc kubenswrapper[4681]: I1007 17:16:43.780443 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qmdst" Oct 07 17:16:43 crc kubenswrapper[4681]: I1007 17:16:43.834920 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qmdst" Oct 07 17:16:44 crc kubenswrapper[4681]: I1007 17:16:44.066315 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qmdst" Oct 07 17:16:46 crc kubenswrapper[4681]: I1007 17:16:46.040105 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5996f7f8c8-n6prk" event={"ID":"350e1b80-5296-4a5b-a604-e9a42b56cbd1","Type":"ContainerStarted","Data":"4171d70de1eb3bf97d57ab5da8111b860815f0a758f1fbc6e1c21afaa17709f9"} Oct 07 17:16:46 crc kubenswrapper[4681]: I1007 17:16:46.040427 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5996f7f8c8-n6prk" Oct 07 17:16:46 crc kubenswrapper[4681]: I1007 17:16:46.060812 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5996f7f8c8-n6prk" podStartSLOduration=2.178200778 podStartE2EDuration="9.060787659s" podCreationTimestamp="2025-10-07 17:16:37 +0000 UTC" firstStartedPulling="2025-10-07 17:16:38.162850081 +0000 UTC m=+801.810261636" lastFinishedPulling="2025-10-07 17:16:45.045436972 +0000 UTC m=+808.692848517" observedRunningTime="2025-10-07 17:16:46.057375564 +0000 UTC m=+809.704787119" watchObservedRunningTime="2025-10-07 17:16:46.060787659 +0000 UTC m=+809.708199214" Oct 07 17:16:46 crc kubenswrapper[4681]: I1007 17:16:46.455819 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d8mcr"] Oct 07 17:16:46 crc kubenswrapper[4681]: E1007 17:16:46.456120 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f29d3c6-1251-4e35-abb5-87df61a17eaa" containerName="extract-utilities" Oct 07 17:16:46 crc kubenswrapper[4681]: I1007 17:16:46.456140 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f29d3c6-1251-4e35-abb5-87df61a17eaa" containerName="extract-utilities" Oct 07 17:16:46 crc kubenswrapper[4681]: E1007 17:16:46.456156 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f29d3c6-1251-4e35-abb5-87df61a17eaa" containerName="extract-content" Oct 07 17:16:46 crc 
kubenswrapper[4681]: I1007 17:16:46.456165 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f29d3c6-1251-4e35-abb5-87df61a17eaa" containerName="extract-content" Oct 07 17:16:46 crc kubenswrapper[4681]: E1007 17:16:46.456177 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f29d3c6-1251-4e35-abb5-87df61a17eaa" containerName="registry-server" Oct 07 17:16:46 crc kubenswrapper[4681]: I1007 17:16:46.456187 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f29d3c6-1251-4e35-abb5-87df61a17eaa" containerName="registry-server" Oct 07 17:16:46 crc kubenswrapper[4681]: I1007 17:16:46.456340 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f29d3c6-1251-4e35-abb5-87df61a17eaa" containerName="registry-server" Oct 07 17:16:46 crc kubenswrapper[4681]: I1007 17:16:46.457311 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d8mcr" Oct 07 17:16:46 crc kubenswrapper[4681]: I1007 17:16:46.471696 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d8mcr"] Oct 07 17:16:46 crc kubenswrapper[4681]: I1007 17:16:46.568093 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/972f9fea-fcbb-4992-853e-4774ecef09e2-utilities\") pod \"redhat-marketplace-d8mcr\" (UID: \"972f9fea-fcbb-4992-853e-4774ecef09e2\") " pod="openshift-marketplace/redhat-marketplace-d8mcr" Oct 07 17:16:46 crc kubenswrapper[4681]: I1007 17:16:46.568199 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kznsd\" (UniqueName: \"kubernetes.io/projected/972f9fea-fcbb-4992-853e-4774ecef09e2-kube-api-access-kznsd\") pod \"redhat-marketplace-d8mcr\" (UID: \"972f9fea-fcbb-4992-853e-4774ecef09e2\") " pod="openshift-marketplace/redhat-marketplace-d8mcr" Oct 07 17:16:46 crc kubenswrapper[4681]: I1007 17:16:46.568222 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/972f9fea-fcbb-4992-853e-4774ecef09e2-catalog-content\") pod \"redhat-marketplace-d8mcr\" (UID: \"972f9fea-fcbb-4992-853e-4774ecef09e2\") " pod="openshift-marketplace/redhat-marketplace-d8mcr" Oct 07 17:16:46 crc kubenswrapper[4681]: I1007 17:16:46.668852 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kznsd\" (UniqueName: \"kubernetes.io/projected/972f9fea-fcbb-4992-853e-4774ecef09e2-kube-api-access-kznsd\") pod \"redhat-marketplace-d8mcr\" (UID: \"972f9fea-fcbb-4992-853e-4774ecef09e2\") " pod="openshift-marketplace/redhat-marketplace-d8mcr" Oct 07 17:16:46 crc kubenswrapper[4681]: I1007 17:16:46.669211 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/972f9fea-fcbb-4992-853e-4774ecef09e2-catalog-content\") pod \"redhat-marketplace-d8mcr\" (UID: \"972f9fea-fcbb-4992-853e-4774ecef09e2\") " pod="openshift-marketplace/redhat-marketplace-d8mcr" Oct 07 17:16:46 crc kubenswrapper[4681]: I1007 17:16:46.669633 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/972f9fea-fcbb-4992-853e-4774ecef09e2-catalog-content\") pod \"redhat-marketplace-d8mcr\" (UID: \"972f9fea-fcbb-4992-853e-4774ecef09e2\") " 
pod="openshift-marketplace/redhat-marketplace-d8mcr" Oct 07 17:16:46 crc kubenswrapper[4681]: I1007 17:16:46.669699 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/972f9fea-fcbb-4992-853e-4774ecef09e2-utilities\") pod \"redhat-marketplace-d8mcr\" (UID: \"972f9fea-fcbb-4992-853e-4774ecef09e2\") " pod="openshift-marketplace/redhat-marketplace-d8mcr" Oct 07 17:16:46 crc kubenswrapper[4681]: I1007 17:16:46.669986 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/972f9fea-fcbb-4992-853e-4774ecef09e2-utilities\") pod \"redhat-marketplace-d8mcr\" (UID: \"972f9fea-fcbb-4992-853e-4774ecef09e2\") " pod="openshift-marketplace/redhat-marketplace-d8mcr" Oct 07 17:16:46 crc kubenswrapper[4681]: I1007 17:16:46.686432 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kznsd\" (UniqueName: \"kubernetes.io/projected/972f9fea-fcbb-4992-853e-4774ecef09e2-kube-api-access-kznsd\") pod \"redhat-marketplace-d8mcr\" (UID: \"972f9fea-fcbb-4992-853e-4774ecef09e2\") " pod="openshift-marketplace/redhat-marketplace-d8mcr" Oct 07 17:16:46 crc kubenswrapper[4681]: I1007 17:16:46.777010 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d8mcr" Oct 07 17:16:47 crc kubenswrapper[4681]: I1007 17:16:47.224199 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d8mcr"] Oct 07 17:16:47 crc kubenswrapper[4681]: I1007 17:16:47.453651 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qmdst"] Oct 07 17:16:47 crc kubenswrapper[4681]: I1007 17:16:47.454513 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qmdst" podUID="7066ab9a-9c49-4489-bff4-d631a6e222db" containerName="registry-server" containerID="cri-o://2d7bafcddd53e721c49607aa3a2ebfe2be874eb8ec363f288550e4e5d6361405" gracePeriod=2 Oct 07 17:16:48 crc kubenswrapper[4681]: I1007 17:16:48.054794 4681 generic.go:334] "Generic (PLEG): container finished" podID="7066ab9a-9c49-4489-bff4-d631a6e222db" containerID="2d7bafcddd53e721c49607aa3a2ebfe2be874eb8ec363f288550e4e5d6361405" exitCode=0 Oct 07 17:16:48 crc kubenswrapper[4681]: I1007 17:16:48.054863 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmdst" event={"ID":"7066ab9a-9c49-4489-bff4-d631a6e222db","Type":"ContainerDied","Data":"2d7bafcddd53e721c49607aa3a2ebfe2be874eb8ec363f288550e4e5d6361405"} Oct 07 17:16:48 crc kubenswrapper[4681]: I1007 17:16:48.056451 4681 generic.go:334] "Generic (PLEG): container finished" podID="972f9fea-fcbb-4992-853e-4774ecef09e2" containerID="6a30851c19c5f5fea131701cde2c082cc60d4cdfe27452e6563a39506374b1a1" exitCode=0 Oct 07 17:16:48 crc kubenswrapper[4681]: I1007 17:16:48.056495 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d8mcr" event={"ID":"972f9fea-fcbb-4992-853e-4774ecef09e2","Type":"ContainerDied","Data":"6a30851c19c5f5fea131701cde2c082cc60d4cdfe27452e6563a39506374b1a1"} Oct 07 17:16:48 crc kubenswrapper[4681]: I1007 17:16:48.056538 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d8mcr" 
event={"ID":"972f9fea-fcbb-4992-853e-4774ecef09e2","Type":"ContainerStarted","Data":"81df1071f242f089d20e0f31a1beb1dc1618f3faf7f7ccf2722347bdb92fa706"} Oct 07 17:16:48 crc kubenswrapper[4681]: I1007 17:16:48.353035 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qmdst" Oct 07 17:16:48 crc kubenswrapper[4681]: I1007 17:16:48.489027 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7066ab9a-9c49-4489-bff4-d631a6e222db-catalog-content\") pod \"7066ab9a-9c49-4489-bff4-d631a6e222db\" (UID: \"7066ab9a-9c49-4489-bff4-d631a6e222db\") " Oct 07 17:16:48 crc kubenswrapper[4681]: I1007 17:16:48.489317 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7066ab9a-9c49-4489-bff4-d631a6e222db-utilities\") pod \"7066ab9a-9c49-4489-bff4-d631a6e222db\" (UID: \"7066ab9a-9c49-4489-bff4-d631a6e222db\") " Oct 07 17:16:48 crc kubenswrapper[4681]: I1007 17:16:48.489393 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45gth\" (UniqueName: \"kubernetes.io/projected/7066ab9a-9c49-4489-bff4-d631a6e222db-kube-api-access-45gth\") pod \"7066ab9a-9c49-4489-bff4-d631a6e222db\" (UID: \"7066ab9a-9c49-4489-bff4-d631a6e222db\") " Oct 07 17:16:48 crc kubenswrapper[4681]: I1007 17:16:48.490623 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7066ab9a-9c49-4489-bff4-d631a6e222db-utilities" (OuterVolumeSpecName: "utilities") pod "7066ab9a-9c49-4489-bff4-d631a6e222db" (UID: "7066ab9a-9c49-4489-bff4-d631a6e222db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:16:48 crc kubenswrapper[4681]: I1007 17:16:48.493867 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7066ab9a-9c49-4489-bff4-d631a6e222db-kube-api-access-45gth" (OuterVolumeSpecName: "kube-api-access-45gth") pod "7066ab9a-9c49-4489-bff4-d631a6e222db" (UID: "7066ab9a-9c49-4489-bff4-d631a6e222db"). InnerVolumeSpecName "kube-api-access-45gth". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:16:48 crc kubenswrapper[4681]: I1007 17:16:48.546729 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7066ab9a-9c49-4489-bff4-d631a6e222db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7066ab9a-9c49-4489-bff4-d631a6e222db" (UID: "7066ab9a-9c49-4489-bff4-d631a6e222db"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:16:48 crc kubenswrapper[4681]: I1007 17:16:48.590418 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7066ab9a-9c49-4489-bff4-d631a6e222db-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 17:16:48 crc kubenswrapper[4681]: I1007 17:16:48.590451 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7066ab9a-9c49-4489-bff4-d631a6e222db-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 17:16:48 crc kubenswrapper[4681]: I1007 17:16:48.590460 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45gth\" (UniqueName: \"kubernetes.io/projected/7066ab9a-9c49-4489-bff4-d631a6e222db-kube-api-access-45gth\") on node \"crc\" DevicePath \"\"" Oct 07 17:16:49 crc kubenswrapper[4681]: I1007 17:16:49.062797 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmdst" event={"ID":"7066ab9a-9c49-4489-bff4-d631a6e222db","Type":"ContainerDied","Data":"6e83bb58fb155391a97a235a2a782efc322b3fae0e3f796a511506ce78deae20"} Oct 07 17:16:49 crc kubenswrapper[4681]: I1007 17:16:49.062863 4681 scope.go:117] "RemoveContainer" containerID="2d7bafcddd53e721c49607aa3a2ebfe2be874eb8ec363f288550e4e5d6361405" Oct 07 17:16:49 crc kubenswrapper[4681]: I1007 17:16:49.062817 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qmdst" Oct 07 17:16:49 crc kubenswrapper[4681]: I1007 17:16:49.065527 4681 generic.go:334] "Generic (PLEG): container finished" podID="972f9fea-fcbb-4992-853e-4774ecef09e2" containerID="f8f317bfbcc9be3b2f20e11fa9df40ca5382c24f683829ce42217add76c60f58" exitCode=0 Oct 07 17:16:49 crc kubenswrapper[4681]: I1007 17:16:49.065557 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d8mcr" event={"ID":"972f9fea-fcbb-4992-853e-4774ecef09e2","Type":"ContainerDied","Data":"f8f317bfbcc9be3b2f20e11fa9df40ca5382c24f683829ce42217add76c60f58"} Oct 07 17:16:49 crc kubenswrapper[4681]: I1007 17:16:49.079692 4681 scope.go:117] "RemoveContainer" containerID="2255049d3ad723796faa3881c8129c40fe981c97b66ebd2decf4eae79df25261" Oct 07 17:16:49 crc kubenswrapper[4681]: I1007 17:16:49.122140 4681 scope.go:117] "RemoveContainer" containerID="6658a786c201100d1880b0abc05808bf16705b99cb89415f4ef308b66bf2bdda" Oct 07 17:16:49 crc kubenswrapper[4681]: I1007 17:16:49.130014 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qmdst"] Oct 07 17:16:49 crc kubenswrapper[4681]: I1007 17:16:49.136824 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qmdst"] Oct 07 17:16:50 crc kubenswrapper[4681]: I1007 17:16:50.074690 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d8mcr" event={"ID":"972f9fea-fcbb-4992-853e-4774ecef09e2","Type":"ContainerStarted","Data":"f6010b4104717951dc22ab4817bde22ea3e493499748279a1ce2f64e20276e79"} Oct 07 17:16:50 crc kubenswrapper[4681]: I1007 17:16:50.095388 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d8mcr" podStartSLOduration=2.401189757 podStartE2EDuration="4.095362706s" podCreationTimestamp="2025-10-07 17:16:46 +0000 UTC" firstStartedPulling="2025-10-07 17:16:48.057903683 +0000 UTC 
m=+811.705315238" lastFinishedPulling="2025-10-07 17:16:49.752076632 +0000 UTC m=+813.399488187" observedRunningTime="2025-10-07 17:16:50.092547678 +0000 UTC m=+813.739959223" watchObservedRunningTime="2025-10-07 17:16:50.095362706 +0000 UTC m=+813.742774261" Oct 07 17:16:51 crc kubenswrapper[4681]: I1007 17:16:51.043325 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7066ab9a-9c49-4489-bff4-d631a6e222db" path="/var/lib/kubelet/pods/7066ab9a-9c49-4489-bff4-d631a6e222db/volumes" Oct 07 17:16:53 crc kubenswrapper[4681]: I1007 17:16:53.282973 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dxv78"] Oct 07 17:16:53 crc kubenswrapper[4681]: E1007 17:16:53.284207 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7066ab9a-9c49-4489-bff4-d631a6e222db" containerName="extract-content" Oct 07 17:16:53 crc kubenswrapper[4681]: I1007 17:16:53.284298 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="7066ab9a-9c49-4489-bff4-d631a6e222db" containerName="extract-content" Oct 07 17:16:53 crc kubenswrapper[4681]: E1007 17:16:53.284362 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7066ab9a-9c49-4489-bff4-d631a6e222db" containerName="extract-utilities" Oct 07 17:16:53 crc kubenswrapper[4681]: I1007 17:16:53.284412 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="7066ab9a-9c49-4489-bff4-d631a6e222db" containerName="extract-utilities" Oct 07 17:16:53 crc kubenswrapper[4681]: E1007 17:16:53.284476 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7066ab9a-9c49-4489-bff4-d631a6e222db" containerName="registry-server" Oct 07 17:16:53 crc kubenswrapper[4681]: I1007 17:16:53.284524 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="7066ab9a-9c49-4489-bff4-d631a6e222db" containerName="registry-server" Oct 07 17:16:53 crc kubenswrapper[4681]: I1007 17:16:53.284782 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="7066ab9a-9c49-4489-bff4-d631a6e222db" containerName="registry-server" Oct 07 17:16:53 crc kubenswrapper[4681]: I1007 17:16:53.285655 4681 util.go:30] "No sandbox for pod can be found. 
Oct 07 17:16:53 crc kubenswrapper[4681]: I1007 17:16:53.307343 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dxv78"]
Oct 07 17:16:53 crc kubenswrapper[4681]: I1007 17:16:53.458389 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eaccb5ab-615d-41af-aa4e-39b88c532883-utilities\") pod \"certified-operators-dxv78\" (UID: \"eaccb5ab-615d-41af-aa4e-39b88c532883\") " pod="openshift-marketplace/certified-operators-dxv78"
Oct 07 17:16:53 crc kubenswrapper[4681]: I1007 17:16:53.458427 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh9qx\" (UniqueName: \"kubernetes.io/projected/eaccb5ab-615d-41af-aa4e-39b88c532883-kube-api-access-bh9qx\") pod \"certified-operators-dxv78\" (UID: \"eaccb5ab-615d-41af-aa4e-39b88c532883\") " pod="openshift-marketplace/certified-operators-dxv78"
Oct 07 17:16:53 crc kubenswrapper[4681]: I1007 17:16:53.458455 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eaccb5ab-615d-41af-aa4e-39b88c532883-catalog-content\") pod \"certified-operators-dxv78\" (UID: \"eaccb5ab-615d-41af-aa4e-39b88c532883\") " pod="openshift-marketplace/certified-operators-dxv78"
Oct 07 17:16:53 crc kubenswrapper[4681]: I1007 17:16:53.559992 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eaccb5ab-615d-41af-aa4e-39b88c532883-utilities\") pod \"certified-operators-dxv78\" (UID: \"eaccb5ab-615d-41af-aa4e-39b88c532883\") " pod="openshift-marketplace/certified-operators-dxv78"
Oct 07 17:16:53 crc kubenswrapper[4681]: I1007 17:16:53.560044 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh9qx\" (UniqueName: \"kubernetes.io/projected/eaccb5ab-615d-41af-aa4e-39b88c532883-kube-api-access-bh9qx\") pod \"certified-operators-dxv78\" (UID: \"eaccb5ab-615d-41af-aa4e-39b88c532883\") " pod="openshift-marketplace/certified-operators-dxv78"
Oct 07 17:16:53 crc kubenswrapper[4681]: I1007 17:16:53.560076 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eaccb5ab-615d-41af-aa4e-39b88c532883-catalog-content\") pod \"certified-operators-dxv78\" (UID: \"eaccb5ab-615d-41af-aa4e-39b88c532883\") " pod="openshift-marketplace/certified-operators-dxv78"
Oct 07 17:16:53 crc kubenswrapper[4681]: I1007 17:16:53.560630 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eaccb5ab-615d-41af-aa4e-39b88c532883-catalog-content\") pod \"certified-operators-dxv78\" (UID: \"eaccb5ab-615d-41af-aa4e-39b88c532883\") " pod="openshift-marketplace/certified-operators-dxv78"
Oct 07 17:16:53 crc kubenswrapper[4681]: I1007 17:16:53.560626 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eaccb5ab-615d-41af-aa4e-39b88c532883-utilities\") pod \"certified-operators-dxv78\" (UID: \"eaccb5ab-615d-41af-aa4e-39b88c532883\") " pod="openshift-marketplace/certified-operators-dxv78"
Oct 07 17:16:53 crc kubenswrapper[4681]: I1007 17:16:53.591103 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh9qx\" (UniqueName: \"kubernetes.io/projected/eaccb5ab-615d-41af-aa4e-39b88c532883-kube-api-access-bh9qx\") pod \"certified-operators-dxv78\" (UID: \"eaccb5ab-615d-41af-aa4e-39b88c532883\") " pod="openshift-marketplace/certified-operators-dxv78"
Oct 07 17:16:53 crc kubenswrapper[4681]: I1007 17:16:53.602256 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dxv78"
Oct 07 17:16:54 crc kubenswrapper[4681]: I1007 17:16:54.149490 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dxv78"]
Oct 07 17:16:55 crc kubenswrapper[4681]: I1007 17:16:55.100035 4681 generic.go:334] "Generic (PLEG): container finished" podID="eaccb5ab-615d-41af-aa4e-39b88c532883" containerID="0a92db06c352f351bf99b88dbd75ff8a98d00fbfc76f155210dc73ba7029b5e6" exitCode=0
Oct 07 17:16:55 crc kubenswrapper[4681]: I1007 17:16:55.100142 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxv78" event={"ID":"eaccb5ab-615d-41af-aa4e-39b88c532883","Type":"ContainerDied","Data":"0a92db06c352f351bf99b88dbd75ff8a98d00fbfc76f155210dc73ba7029b5e6"}
Oct 07 17:16:55 crc kubenswrapper[4681]: I1007 17:16:55.100345 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxv78" event={"ID":"eaccb5ab-615d-41af-aa4e-39b88c532883","Type":"ContainerStarted","Data":"a93ddb04d8567aef5c78aa8b29bec47bda87c8f1f709e6925ef6d690a823b998"}
Oct 07 17:16:56 crc kubenswrapper[4681]: I1007 17:16:56.106505 4681 generic.go:334] "Generic (PLEG): container finished" podID="eaccb5ab-615d-41af-aa4e-39b88c532883" containerID="9a93855ad68681e18242d03ca640ded6693581c7d94a6ba621b43dcb36f646e8" exitCode=0
Oct 07 17:16:56 crc kubenswrapper[4681]: I1007 17:16:56.106744 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxv78" event={"ID":"eaccb5ab-615d-41af-aa4e-39b88c532883","Type":"ContainerDied","Data":"9a93855ad68681e18242d03ca640ded6693581c7d94a6ba621b43dcb36f646e8"}
Oct 07 17:16:56 crc kubenswrapper[4681]: I1007 17:16:56.778756 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d8mcr"
Oct 07 17:16:56 crc kubenswrapper[4681]: I1007 17:16:56.779058 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d8mcr"
Oct 07 17:16:56 crc kubenswrapper[4681]: I1007 17:16:56.826310 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d8mcr"
Oct 07 17:16:57 crc kubenswrapper[4681]: I1007 17:16:57.115282 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxv78" event={"ID":"eaccb5ab-615d-41af-aa4e-39b88c532883","Type":"ContainerStarted","Data":"a9aa97a10622c847c3f985a76187a824e539627072c84b11773be41e7f6cc391"}
Oct 07 17:16:57 crc kubenswrapper[4681]: I1007 17:16:57.140157 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dxv78" podStartSLOduration=2.747434246 podStartE2EDuration="4.14013029s" podCreationTimestamp="2025-10-07 17:16:53 +0000 UTC" firstStartedPulling="2025-10-07 17:16:55.102400211 +0000 UTC m=+818.749811766" lastFinishedPulling="2025-10-07 17:16:56.495096255 +0000 UTC m=+820.142507810" observedRunningTime="2025-10-07 17:16:57.13762427 +0000 UTC m=+820.785035825" watchObservedRunningTime="2025-10-07 17:16:57.14013029 +0000 UTC m=+820.787541855"
Oct 07 17:16:57 crc kubenswrapper[4681]: I1007 17:16:57.164315 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d8mcr"
Oct 07 17:16:57 crc kubenswrapper[4681]: I1007 17:16:57.743915 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5996f7f8c8-n6prk"
Oct 07 17:16:59 crc kubenswrapper[4681]: I1007 17:16:59.251404 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d8mcr"]
Oct 07 17:16:59 crc kubenswrapper[4681]: I1007 17:16:59.254779 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d8mcr" podUID="972f9fea-fcbb-4992-853e-4774ecef09e2" containerName="registry-server" containerID="cri-o://f6010b4104717951dc22ab4817bde22ea3e493499748279a1ce2f64e20276e79" gracePeriod=2
Oct 07 17:16:59 crc kubenswrapper[4681]: I1007 17:16:59.743561 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d8mcr"
Oct 07 17:16:59 crc kubenswrapper[4681]: I1007 17:16:59.749918 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/972f9fea-fcbb-4992-853e-4774ecef09e2-utilities\") pod \"972f9fea-fcbb-4992-853e-4774ecef09e2\" (UID: \"972f9fea-fcbb-4992-853e-4774ecef09e2\") "
Oct 07 17:16:59 crc kubenswrapper[4681]: I1007 17:16:59.750036 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kznsd\" (UniqueName: \"kubernetes.io/projected/972f9fea-fcbb-4992-853e-4774ecef09e2-kube-api-access-kznsd\") pod \"972f9fea-fcbb-4992-853e-4774ecef09e2\" (UID: \"972f9fea-fcbb-4992-853e-4774ecef09e2\") "
Oct 07 17:16:59 crc kubenswrapper[4681]: I1007 17:16:59.750148 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/972f9fea-fcbb-4992-853e-4774ecef09e2-catalog-content\") pod \"972f9fea-fcbb-4992-853e-4774ecef09e2\" (UID: \"972f9fea-fcbb-4992-853e-4774ecef09e2\") "
Oct 07 17:16:59 crc kubenswrapper[4681]: I1007 17:16:59.750761 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/972f9fea-fcbb-4992-853e-4774ecef09e2-utilities" (OuterVolumeSpecName: "utilities") pod "972f9fea-fcbb-4992-853e-4774ecef09e2" (UID: "972f9fea-fcbb-4992-853e-4774ecef09e2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 17:16:59 crc kubenswrapper[4681]: I1007 17:16:59.755586 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/972f9fea-fcbb-4992-853e-4774ecef09e2-kube-api-access-kznsd" (OuterVolumeSpecName: "kube-api-access-kznsd") pod "972f9fea-fcbb-4992-853e-4774ecef09e2" (UID: "972f9fea-fcbb-4992-853e-4774ecef09e2"). InnerVolumeSpecName "kube-api-access-kznsd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 17:16:59 crc kubenswrapper[4681]: I1007 17:16:59.764271 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/972f9fea-fcbb-4992-853e-4774ecef09e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "972f9fea-fcbb-4992-853e-4774ecef09e2" (UID: "972f9fea-fcbb-4992-853e-4774ecef09e2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 17:16:59 crc kubenswrapper[4681]: I1007 17:16:59.851357 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/972f9fea-fcbb-4992-853e-4774ecef09e2-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 17:16:59 crc kubenswrapper[4681]: I1007 17:16:59.851394 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/972f9fea-fcbb-4992-853e-4774ecef09e2-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 17:16:59 crc kubenswrapper[4681]: I1007 17:16:59.851405 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kznsd\" (UniqueName: \"kubernetes.io/projected/972f9fea-fcbb-4992-853e-4774ecef09e2-kube-api-access-kznsd\") on node \"crc\" DevicePath \"\""
Oct 07 17:17:00 crc kubenswrapper[4681]: I1007 17:17:00.133738 4681 generic.go:334] "Generic (PLEG): container finished" podID="972f9fea-fcbb-4992-853e-4774ecef09e2" containerID="f6010b4104717951dc22ab4817bde22ea3e493499748279a1ce2f64e20276e79" exitCode=0
Oct 07 17:17:00 crc kubenswrapper[4681]: I1007 17:17:00.133785 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d8mcr" event={"ID":"972f9fea-fcbb-4992-853e-4774ecef09e2","Type":"ContainerDied","Data":"f6010b4104717951dc22ab4817bde22ea3e493499748279a1ce2f64e20276e79"}
Oct 07 17:17:00 crc kubenswrapper[4681]: I1007 17:17:00.133812 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d8mcr" event={"ID":"972f9fea-fcbb-4992-853e-4774ecef09e2","Type":"ContainerDied","Data":"81df1071f242f089d20e0f31a1beb1dc1618f3faf7f7ccf2722347bdb92fa706"}
Oct 07 17:17:00 crc kubenswrapper[4681]: I1007 17:17:00.133829 4681 scope.go:117] "RemoveContainer" containerID="f6010b4104717951dc22ab4817bde22ea3e493499748279a1ce2f64e20276e79"
Oct 07 17:17:00 crc kubenswrapper[4681]: I1007 17:17:00.133990 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d8mcr"
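The kuberuntime_container.go:808 entry above records a stop with gracePeriod=2: the runtime is asked to stop registry-server, and only if the container has not exited within the grace window is it force-killed. A sketch of that stop-then-escalate shape, with hypothetical callbacks standing in for the CRI calls; the general pattern only, not kubelet's implementation:

```go
// Illustrative: stop a container, wait out the grace period, then escalate.
// The exited channel and kill callback are invented stand-ins for CRI calls.
package main

import (
	"fmt"
	"time"
)

func killWithGrace(id string, grace time.Duration, exited <-chan struct{}, kill func(string)) {
	fmt.Printf("Killing container with a grace period containerID=%q gracePeriod=%v\n", id, grace)
	select {
	case <-exited:
		// Container honored the termination signal within the grace window.
	case <-time.After(grace):
		kill(id) // grace elapsed; escalate to a forced kill
	}
}

func main() {
	exited := make(chan struct{})
	go func() { time.Sleep(200 * time.Millisecond); close(exited) }()
	killWithGrace("cri-o://f6010b41...", 2*time.Second, exited,
		func(id string) { fmt.Println("force-killing", id) })
}
```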
Oct 07 17:17:00 crc kubenswrapper[4681]: I1007 17:17:00.156714 4681 scope.go:117] "RemoveContainer" containerID="f8f317bfbcc9be3b2f20e11fa9df40ca5382c24f683829ce42217add76c60f58"
Oct 07 17:17:00 crc kubenswrapper[4681]: I1007 17:17:00.178406 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d8mcr"]
Oct 07 17:17:00 crc kubenswrapper[4681]: I1007 17:17:00.182420 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d8mcr"]
Oct 07 17:17:00 crc kubenswrapper[4681]: I1007 17:17:00.183642 4681 scope.go:117] "RemoveContainer" containerID="6a30851c19c5f5fea131701cde2c082cc60d4cdfe27452e6563a39506374b1a1"
Oct 07 17:17:00 crc kubenswrapper[4681]: I1007 17:17:00.202652 4681 scope.go:117] "RemoveContainer" containerID="f6010b4104717951dc22ab4817bde22ea3e493499748279a1ce2f64e20276e79"
Oct 07 17:17:00 crc kubenswrapper[4681]: E1007 17:17:00.203153 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6010b4104717951dc22ab4817bde22ea3e493499748279a1ce2f64e20276e79\": container with ID starting with f6010b4104717951dc22ab4817bde22ea3e493499748279a1ce2f64e20276e79 not found: ID does not exist" containerID="f6010b4104717951dc22ab4817bde22ea3e493499748279a1ce2f64e20276e79"
Oct 07 17:17:00 crc kubenswrapper[4681]: I1007 17:17:00.203211 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6010b4104717951dc22ab4817bde22ea3e493499748279a1ce2f64e20276e79"} err="failed to get container status \"f6010b4104717951dc22ab4817bde22ea3e493499748279a1ce2f64e20276e79\": rpc error: code = NotFound desc = could not find container \"f6010b4104717951dc22ab4817bde22ea3e493499748279a1ce2f64e20276e79\": container with ID starting with f6010b4104717951dc22ab4817bde22ea3e493499748279a1ce2f64e20276e79 not found: ID does not exist"
Oct 07 17:17:00 crc kubenswrapper[4681]: I1007 17:17:00.203251 4681 scope.go:117] "RemoveContainer" containerID="f8f317bfbcc9be3b2f20e11fa9df40ca5382c24f683829ce42217add76c60f58"
Oct 07 17:17:00 crc kubenswrapper[4681]: E1007 17:17:00.203660 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8f317bfbcc9be3b2f20e11fa9df40ca5382c24f683829ce42217add76c60f58\": container with ID starting with f8f317bfbcc9be3b2f20e11fa9df40ca5382c24f683829ce42217add76c60f58 not found: ID does not exist" containerID="f8f317bfbcc9be3b2f20e11fa9df40ca5382c24f683829ce42217add76c60f58"
Oct 07 17:17:00 crc kubenswrapper[4681]: I1007 17:17:00.203695 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8f317bfbcc9be3b2f20e11fa9df40ca5382c24f683829ce42217add76c60f58"} err="failed to get container status \"f8f317bfbcc9be3b2f20e11fa9df40ca5382c24f683829ce42217add76c60f58\": rpc error: code = NotFound desc = could not find container \"f8f317bfbcc9be3b2f20e11fa9df40ca5382c24f683829ce42217add76c60f58\": container with ID starting with f8f317bfbcc9be3b2f20e11fa9df40ca5382c24f683829ce42217add76c60f58 not found: ID does not exist"
Oct 07 17:17:00 crc kubenswrapper[4681]: I1007 17:17:00.203721 4681 scope.go:117] "RemoveContainer" containerID="6a30851c19c5f5fea131701cde2c082cc60d4cdfe27452e6563a39506374b1a1"
Oct 07 17:17:00 crc kubenswrapper[4681]: E1007 17:17:00.204193 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a30851c19c5f5fea131701cde2c082cc60d4cdfe27452e6563a39506374b1a1\": container with ID starting with 6a30851c19c5f5fea131701cde2c082cc60d4cdfe27452e6563a39506374b1a1 not found: ID does not exist" containerID="6a30851c19c5f5fea131701cde2c082cc60d4cdfe27452e6563a39506374b1a1"
Oct 07 17:17:00 crc kubenswrapper[4681]: I1007 17:17:00.204238 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a30851c19c5f5fea131701cde2c082cc60d4cdfe27452e6563a39506374b1a1"} err="failed to get container status \"6a30851c19c5f5fea131701cde2c082cc60d4cdfe27452e6563a39506374b1a1\": rpc error: code = NotFound desc = could not find container \"6a30851c19c5f5fea131701cde2c082cc60d4cdfe27452e6563a39506374b1a1\": container with ID starting with 6a30851c19c5f5fea131701cde2c082cc60d4cdfe27452e6563a39506374b1a1 not found: ID does not exist"
Oct 07 17:17:01 crc kubenswrapper[4681]: I1007 17:17:01.047726 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="972f9fea-fcbb-4992-853e-4774ecef09e2" path="/var/lib/kubelet/pods/972f9fea-fcbb-4992-853e-4774ecef09e2/volumes"
Oct 07 17:17:03 crc kubenswrapper[4681]: I1007 17:17:03.602716 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dxv78"
Oct 07 17:17:03 crc kubenswrapper[4681]: I1007 17:17:03.602777 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dxv78"
Oct 07 17:17:03 crc kubenswrapper[4681]: I1007 17:17:03.651433 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dxv78"
Oct 07 17:17:04 crc kubenswrapper[4681]: I1007 17:17:04.192483 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dxv78"
Oct 07 17:17:05 crc kubenswrapper[4681]: I1007 17:17:05.248555 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dxv78"]
Oct 07 17:17:06 crc kubenswrapper[4681]: I1007 17:17:06.165012 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dxv78" podUID="eaccb5ab-615d-41af-aa4e-39b88c532883" containerName="registry-server" containerID="cri-o://a9aa97a10622c847c3f985a76187a824e539627072c84b11773be41e7f6cc391" gracePeriod=2
Oct 07 17:17:06 crc kubenswrapper[4681]: I1007 17:17:06.620922 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dxv78"
Oct 07 17:17:06 crc kubenswrapper[4681]: I1007 17:17:06.634281 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eaccb5ab-615d-41af-aa4e-39b88c532883-utilities\") pod \"eaccb5ab-615d-41af-aa4e-39b88c532883\" (UID: \"eaccb5ab-615d-41af-aa4e-39b88c532883\") "
Oct 07 17:17:06 crc kubenswrapper[4681]: I1007 17:17:06.634433 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh9qx\" (UniqueName: \"kubernetes.io/projected/eaccb5ab-615d-41af-aa4e-39b88c532883-kube-api-access-bh9qx\") pod \"eaccb5ab-615d-41af-aa4e-39b88c532883\" (UID: \"eaccb5ab-615d-41af-aa4e-39b88c532883\") "
Oct 07 17:17:06 crc kubenswrapper[4681]: I1007 17:17:06.634503 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eaccb5ab-615d-41af-aa4e-39b88c532883-catalog-content\") pod \"eaccb5ab-615d-41af-aa4e-39b88c532883\" (UID: \"eaccb5ab-615d-41af-aa4e-39b88c532883\") "
Oct 07 17:17:06 crc kubenswrapper[4681]: I1007 17:17:06.637052 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaccb5ab-615d-41af-aa4e-39b88c532883-utilities" (OuterVolumeSpecName: "utilities") pod "eaccb5ab-615d-41af-aa4e-39b88c532883" (UID: "eaccb5ab-615d-41af-aa4e-39b88c532883"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 17:17:06 crc kubenswrapper[4681]: I1007 17:17:06.651003 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaccb5ab-615d-41af-aa4e-39b88c532883-kube-api-access-bh9qx" (OuterVolumeSpecName: "kube-api-access-bh9qx") pod "eaccb5ab-615d-41af-aa4e-39b88c532883" (UID: "eaccb5ab-615d-41af-aa4e-39b88c532883"). InnerVolumeSpecName "kube-api-access-bh9qx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 17:17:06 crc kubenswrapper[4681]: I1007 17:17:06.735851 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eaccb5ab-615d-41af-aa4e-39b88c532883-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 17:17:06 crc kubenswrapper[4681]: I1007 17:17:06.735905 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bh9qx\" (UniqueName: \"kubernetes.io/projected/eaccb5ab-615d-41af-aa4e-39b88c532883-kube-api-access-bh9qx\") on node \"crc\" DevicePath \"\""
Oct 07 17:17:06 crc kubenswrapper[4681]: I1007 17:17:06.921816 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaccb5ab-615d-41af-aa4e-39b88c532883-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eaccb5ab-615d-41af-aa4e-39b88c532883" (UID: "eaccb5ab-615d-41af-aa4e-39b88c532883"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 17:17:06 crc kubenswrapper[4681]: I1007 17:17:06.938426 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eaccb5ab-615d-41af-aa4e-39b88c532883-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 17:17:07 crc kubenswrapper[4681]: I1007 17:17:07.171845 4681 generic.go:334] "Generic (PLEG): container finished" podID="eaccb5ab-615d-41af-aa4e-39b88c532883" containerID="a9aa97a10622c847c3f985a76187a824e539627072c84b11773be41e7f6cc391" exitCode=0
Oct 07 17:17:07 crc kubenswrapper[4681]: I1007 17:17:07.171908 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxv78" event={"ID":"eaccb5ab-615d-41af-aa4e-39b88c532883","Type":"ContainerDied","Data":"a9aa97a10622c847c3f985a76187a824e539627072c84b11773be41e7f6cc391"}
Oct 07 17:17:07 crc kubenswrapper[4681]: I1007 17:17:07.171935 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxv78" event={"ID":"eaccb5ab-615d-41af-aa4e-39b88c532883","Type":"ContainerDied","Data":"a93ddb04d8567aef5c78aa8b29bec47bda87c8f1f709e6925ef6d690a823b998"}
Oct 07 17:17:07 crc kubenswrapper[4681]: I1007 17:17:07.171953 4681 scope.go:117] "RemoveContainer" containerID="a9aa97a10622c847c3f985a76187a824e539627072c84b11773be41e7f6cc391"
Oct 07 17:17:07 crc kubenswrapper[4681]: I1007 17:17:07.172060 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dxv78"
Oct 07 17:17:07 crc kubenswrapper[4681]: I1007 17:17:07.191962 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dxv78"]
Oct 07 17:17:07 crc kubenswrapper[4681]: I1007 17:17:07.192091 4681 scope.go:117] "RemoveContainer" containerID="9a93855ad68681e18242d03ca640ded6693581c7d94a6ba621b43dcb36f646e8"
Oct 07 17:17:07 crc kubenswrapper[4681]: I1007 17:17:07.200036 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dxv78"]
Oct 07 17:17:07 crc kubenswrapper[4681]: I1007 17:17:07.210353 4681 scope.go:117] "RemoveContainer" containerID="0a92db06c352f351bf99b88dbd75ff8a98d00fbfc76f155210dc73ba7029b5e6"
Oct 07 17:17:07 crc kubenswrapper[4681]: I1007 17:17:07.229729 4681 scope.go:117] "RemoveContainer" containerID="a9aa97a10622c847c3f985a76187a824e539627072c84b11773be41e7f6cc391"
Oct 07 17:17:07 crc kubenswrapper[4681]: E1007 17:17:07.230851 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9aa97a10622c847c3f985a76187a824e539627072c84b11773be41e7f6cc391\": container with ID starting with a9aa97a10622c847c3f985a76187a824e539627072c84b11773be41e7f6cc391 not found: ID does not exist" containerID="a9aa97a10622c847c3f985a76187a824e539627072c84b11773be41e7f6cc391"
Oct 07 17:17:07 crc kubenswrapper[4681]: I1007 17:17:07.230939 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9aa97a10622c847c3f985a76187a824e539627072c84b11773be41e7f6cc391"} err="failed to get container status \"a9aa97a10622c847c3f985a76187a824e539627072c84b11773be41e7f6cc391\": rpc error: code = NotFound desc = could not find container \"a9aa97a10622c847c3f985a76187a824e539627072c84b11773be41e7f6cc391\": container with ID starting with a9aa97a10622c847c3f985a76187a824e539627072c84b11773be41e7f6cc391 not found: ID does not exist"
Oct 07 17:17:07 crc kubenswrapper[4681]: I1007 17:17:07.230966 4681 scope.go:117] "RemoveContainer" containerID="9a93855ad68681e18242d03ca640ded6693581c7d94a6ba621b43dcb36f646e8"
Oct 07 17:17:07 crc kubenswrapper[4681]: E1007 17:17:07.231449 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a93855ad68681e18242d03ca640ded6693581c7d94a6ba621b43dcb36f646e8\": container with ID starting with 9a93855ad68681e18242d03ca640ded6693581c7d94a6ba621b43dcb36f646e8 not found: ID does not exist" containerID="9a93855ad68681e18242d03ca640ded6693581c7d94a6ba621b43dcb36f646e8"
Oct 07 17:17:07 crc kubenswrapper[4681]: I1007 17:17:07.231473 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a93855ad68681e18242d03ca640ded6693581c7d94a6ba621b43dcb36f646e8"} err="failed to get container status \"9a93855ad68681e18242d03ca640ded6693581c7d94a6ba621b43dcb36f646e8\": rpc error: code = NotFound desc = could not find container \"9a93855ad68681e18242d03ca640ded6693581c7d94a6ba621b43dcb36f646e8\": container with ID starting with 9a93855ad68681e18242d03ca640ded6693581c7d94a6ba621b43dcb36f646e8 not found: ID does not exist"
Oct 07 17:17:07 crc kubenswrapper[4681]: I1007 17:17:07.231487 4681 scope.go:117] "RemoveContainer" containerID="0a92db06c352f351bf99b88dbd75ff8a98d00fbfc76f155210dc73ba7029b5e6"
Oct 07 17:17:07 crc kubenswrapper[4681]: E1007 17:17:07.231704 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a92db06c352f351bf99b88dbd75ff8a98d00fbfc76f155210dc73ba7029b5e6\": container with ID starting with 0a92db06c352f351bf99b88dbd75ff8a98d00fbfc76f155210dc73ba7029b5e6 not found: ID does not exist" containerID="0a92db06c352f351bf99b88dbd75ff8a98d00fbfc76f155210dc73ba7029b5e6"
Oct 07 17:17:07 crc kubenswrapper[4681]: I1007 17:17:07.231737 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a92db06c352f351bf99b88dbd75ff8a98d00fbfc76f155210dc73ba7029b5e6"} err="failed to get container status \"0a92db06c352f351bf99b88dbd75ff8a98d00fbfc76f155210dc73ba7029b5e6\": rpc error: code = NotFound desc = could not find container \"0a92db06c352f351bf99b88dbd75ff8a98d00fbfc76f155210dc73ba7029b5e6\": container with ID starting with 0a92db06c352f351bf99b88dbd75ff8a98d00fbfc76f155210dc73ba7029b5e6 not found: ID does not exist"
Oct 07 17:17:09 crc kubenswrapper[4681]: I1007 17:17:09.038598 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaccb5ab-615d-41af-aa4e-39b88c532883" path="/var/lib/kubelet/pods/eaccb5ab-615d-41af-aa4e-39b88c532883/volumes"
Oct 07 17:17:17 crc kubenswrapper[4681]: I1007 17:17:17.448335 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-598476574-wb9sj"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.218004 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-7f48v"]
Oct 07 17:17:18 crc kubenswrapper[4681]: E1007 17:17:18.218205 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="972f9fea-fcbb-4992-853e-4774ecef09e2" containerName="registry-server"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.218215 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="972f9fea-fcbb-4992-853e-4774ecef09e2" containerName="registry-server"
Oct 07 17:17:18 crc kubenswrapper[4681]: E1007 17:17:18.218230 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="972f9fea-fcbb-4992-853e-4774ecef09e2" containerName="extract-content"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.218235 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="972f9fea-fcbb-4992-853e-4774ecef09e2" containerName="extract-content"
Oct 07 17:17:18 crc kubenswrapper[4681]: E1007 17:17:18.218248 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="972f9fea-fcbb-4992-853e-4774ecef09e2" containerName="extract-utilities"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.218254 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="972f9fea-fcbb-4992-853e-4774ecef09e2" containerName="extract-utilities"
Oct 07 17:17:18 crc kubenswrapper[4681]: E1007 17:17:18.218264 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaccb5ab-615d-41af-aa4e-39b88c532883" containerName="extract-utilities"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.218270 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaccb5ab-615d-41af-aa4e-39b88c532883" containerName="extract-utilities"
Oct 07 17:17:18 crc kubenswrapper[4681]: E1007 17:17:18.218279 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaccb5ab-615d-41af-aa4e-39b88c532883" containerName="extract-content"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.218285 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaccb5ab-615d-41af-aa4e-39b88c532883" containerName="extract-content"
Oct 07 17:17:18 crc kubenswrapper[4681]: E1007 17:17:18.218294 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaccb5ab-615d-41af-aa4e-39b88c532883" containerName="registry-server"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.218300 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaccb5ab-615d-41af-aa4e-39b88c532883" containerName="registry-server"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.218394 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="972f9fea-fcbb-4992-853e-4774ecef09e2" containerName="registry-server"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.218405 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaccb5ab-615d-41af-aa4e-39b88c532883" containerName="registry-server"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.220127 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-7f48v"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.222273 4681 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-5ckdt"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.222563 4681 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.222613 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.224745 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-zcsl2"]
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.225360 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zcsl2"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.228311 4681 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.237713 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-zcsl2"]
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.328105 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-nzg7f"]
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.329127 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-nzg7f"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.331505 4681 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.331683 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.331684 4681 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-xfhmd"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.335232 4681 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.342575 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-xszcp"]
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.343504 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-xszcp"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.345946 4681 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.376300 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-xszcp"]
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.385661 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/45c71c49-f633-48af-a495-a1bdf06d66b9-metrics\") pod \"frr-k8s-7f48v\" (UID: \"45c71c49-f633-48af-a495-a1bdf06d66b9\") " pod="metallb-system/frr-k8s-7f48v"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.385731 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45c71c49-f633-48af-a495-a1bdf06d66b9-metrics-certs\") pod \"frr-k8s-7f48v\" (UID: \"45c71c49-f633-48af-a495-a1bdf06d66b9\") " pod="metallb-system/frr-k8s-7f48v"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.385774 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjk7h\" (UniqueName: \"kubernetes.io/projected/0227af93-e3dc-47c9-b6ce-57d25fc998ea-kube-api-access-kjk7h\") pod \"frr-k8s-webhook-server-64bf5d555-zcsl2\" (UID: \"0227af93-e3dc-47c9-b6ce-57d25fc998ea\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zcsl2"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.385802 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/45c71c49-f633-48af-a495-a1bdf06d66b9-frr-startup\") pod \"frr-k8s-7f48v\" (UID: \"45c71c49-f633-48af-a495-a1bdf06d66b9\") " pod="metallb-system/frr-k8s-7f48v"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.385827 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0227af93-e3dc-47c9-b6ce-57d25fc998ea-cert\") pod \"frr-k8s-webhook-server-64bf5d555-zcsl2\" (UID: \"0227af93-e3dc-47c9-b6ce-57d25fc998ea\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zcsl2"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.385851 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/45c71c49-f633-48af-a495-a1bdf06d66b9-reloader\") pod \"frr-k8s-7f48v\" (UID: \"45c71c49-f633-48af-a495-a1bdf06d66b9\") " pod="metallb-system/frr-k8s-7f48v"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.385869 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j82q9\" (UniqueName: \"kubernetes.io/projected/45c71c49-f633-48af-a495-a1bdf06d66b9-kube-api-access-j82q9\") pod \"frr-k8s-7f48v\" (UID: \"45c71c49-f633-48af-a495-a1bdf06d66b9\") " pod="metallb-system/frr-k8s-7f48v"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.386009 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/45c71c49-f633-48af-a495-a1bdf06d66b9-frr-sockets\") pod \"frr-k8s-7f48v\" (UID: \"45c71c49-f633-48af-a495-a1bdf06d66b9\") " pod="metallb-system/frr-k8s-7f48v"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.386397 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/45c71c49-f633-48af-a495-a1bdf06d66b9-frr-conf\") pod \"frr-k8s-7f48v\" (UID: \"45c71c49-f633-48af-a495-a1bdf06d66b9\") " pod="metallb-system/frr-k8s-7f48v"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.487630 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/45c71c49-f633-48af-a495-a1bdf06d66b9-frr-conf\") pod \"frr-k8s-7f48v\" (UID: \"45c71c49-f633-48af-a495-a1bdf06d66b9\") " pod="metallb-system/frr-k8s-7f48v"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.487674 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58d6h\" (UniqueName: \"kubernetes.io/projected/3f34c830-b1bc-433a-af20-0db4f0d96394-kube-api-access-58d6h\") pod \"speaker-nzg7f\" (UID: \"3f34c830-b1bc-433a-af20-0db4f0d96394\") " pod="metallb-system/speaker-nzg7f"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.487715 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-944bc\" (UniqueName: \"kubernetes.io/projected/ec498aeb-7c28-4e30-adee-e4546d01d498-kube-api-access-944bc\") pod \"controller-68d546b9d8-xszcp\" (UID: \"ec498aeb-7c28-4e30-adee-e4546d01d498\") " pod="metallb-system/controller-68d546b9d8-xszcp"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.487736 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3f34c830-b1bc-433a-af20-0db4f0d96394-memberlist\") pod \"speaker-nzg7f\" (UID: \"3f34c830-b1bc-433a-af20-0db4f0d96394\") " pod="metallb-system/speaker-nzg7f"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.487758 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/45c71c49-f633-48af-a495-a1bdf06d66b9-metrics\") pod \"frr-k8s-7f48v\" (UID: \"45c71c49-f633-48af-a495-a1bdf06d66b9\") " pod="metallb-system/frr-k8s-7f48v"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.487773 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45c71c49-f633-48af-a495-a1bdf06d66b9-metrics-certs\") pod \"frr-k8s-7f48v\" (UID: \"45c71c49-f633-48af-a495-a1bdf06d66b9\") " pod="metallb-system/frr-k8s-7f48v"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.487793 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec498aeb-7c28-4e30-adee-e4546d01d498-cert\") pod \"controller-68d546b9d8-xszcp\" (UID: \"ec498aeb-7c28-4e30-adee-e4546d01d498\") " pod="metallb-system/controller-68d546b9d8-xszcp"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.487813 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjk7h\" (UniqueName: \"kubernetes.io/projected/0227af93-e3dc-47c9-b6ce-57d25fc998ea-kube-api-access-kjk7h\") pod \"frr-k8s-webhook-server-64bf5d555-zcsl2\" (UID: \"0227af93-e3dc-47c9-b6ce-57d25fc998ea\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zcsl2"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.487833 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3f34c830-b1bc-433a-af20-0db4f0d96394-metallb-excludel2\") pod \"speaker-nzg7f\" (UID: \"3f34c830-b1bc-433a-af20-0db4f0d96394\") " pod="metallb-system/speaker-nzg7f"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.487921 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/45c71c49-f633-48af-a495-a1bdf06d66b9-frr-startup\") pod \"frr-k8s-7f48v\" (UID: \"45c71c49-f633-48af-a495-a1bdf06d66b9\") " pod="metallb-system/frr-k8s-7f48v"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.487949 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0227af93-e3dc-47c9-b6ce-57d25fc998ea-cert\") pod \"frr-k8s-webhook-server-64bf5d555-zcsl2\" (UID: \"0227af93-e3dc-47c9-b6ce-57d25fc998ea\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zcsl2"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.487974 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/45c71c49-f633-48af-a495-a1bdf06d66b9-reloader\") pod \"frr-k8s-7f48v\" (UID: \"45c71c49-f633-48af-a495-a1bdf06d66b9\") " pod="metallb-system/frr-k8s-7f48v"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.487989 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f34c830-b1bc-433a-af20-0db4f0d96394-metrics-certs\") pod \"speaker-nzg7f\" (UID: \"3f34c830-b1bc-433a-af20-0db4f0d96394\") " pod="metallb-system/speaker-nzg7f"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.488006 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j82q9\" (UniqueName: \"kubernetes.io/projected/45c71c49-f633-48af-a495-a1bdf06d66b9-kube-api-access-j82q9\") pod \"frr-k8s-7f48v\" (UID: \"45c71c49-f633-48af-a495-a1bdf06d66b9\") " pod="metallb-system/frr-k8s-7f48v"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.488023 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/45c71c49-f633-48af-a495-a1bdf06d66b9-frr-sockets\") pod \"frr-k8s-7f48v\" (UID: \"45c71c49-f633-48af-a495-a1bdf06d66b9\") " pod="metallb-system/frr-k8s-7f48v"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.488038 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec498aeb-7c28-4e30-adee-e4546d01d498-metrics-certs\") pod \"controller-68d546b9d8-xszcp\" (UID: \"ec498aeb-7c28-4e30-adee-e4546d01d498\") " pod="metallb-system/controller-68d546b9d8-xszcp"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.488347 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/45c71c49-f633-48af-a495-a1bdf06d66b9-frr-conf\") pod \"frr-k8s-7f48v\" (UID: \"45c71c49-f633-48af-a495-a1bdf06d66b9\") " pod="metallb-system/frr-k8s-7f48v"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.488397 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/45c71c49-f633-48af-a495-a1bdf06d66b9-metrics\") pod \"frr-k8s-7f48v\" (UID: \"45c71c49-f633-48af-a495-a1bdf06d66b9\") " pod="metallb-system/frr-k8s-7f48v"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.488798 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/45c71c49-f633-48af-a495-a1bdf06d66b9-frr-sockets\") pod \"frr-k8s-7f48v\" (UID: \"45c71c49-f633-48af-a495-a1bdf06d66b9\") " pod="metallb-system/frr-k8s-7f48v"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.489175 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/45c71c49-f633-48af-a495-a1bdf06d66b9-reloader\") pod \"frr-k8s-7f48v\" (UID: \"45c71c49-f633-48af-a495-a1bdf06d66b9\") " pod="metallb-system/frr-k8s-7f48v"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.489820 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/45c71c49-f633-48af-a495-a1bdf06d66b9-frr-startup\") pod \"frr-k8s-7f48v\" (UID: \"45c71c49-f633-48af-a495-a1bdf06d66b9\") " pod="metallb-system/frr-k8s-7f48v"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.494632 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45c71c49-f633-48af-a495-a1bdf06d66b9-metrics-certs\") pod \"frr-k8s-7f48v\" (UID: \"45c71c49-f633-48af-a495-a1bdf06d66b9\") " pod="metallb-system/frr-k8s-7f48v"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.506918 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjk7h\" (UniqueName: \"kubernetes.io/projected/0227af93-e3dc-47c9-b6ce-57d25fc998ea-kube-api-access-kjk7h\") pod \"frr-k8s-webhook-server-64bf5d555-zcsl2\" (UID: \"0227af93-e3dc-47c9-b6ce-57d25fc998ea\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zcsl2"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.509516 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0227af93-e3dc-47c9-b6ce-57d25fc998ea-cert\") pod \"frr-k8s-webhook-server-64bf5d555-zcsl2\" (UID: \"0227af93-e3dc-47c9-b6ce-57d25fc998ea\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zcsl2"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.512373 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j82q9\" (UniqueName: \"kubernetes.io/projected/45c71c49-f633-48af-a495-a1bdf06d66b9-kube-api-access-j82q9\") pod \"frr-k8s-7f48v\" (UID: \"45c71c49-f633-48af-a495-a1bdf06d66b9\") " pod="metallb-system/frr-k8s-7f48v"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.545337 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-7f48v"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.557510 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zcsl2"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.588909 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58d6h\" (UniqueName: \"kubernetes.io/projected/3f34c830-b1bc-433a-af20-0db4f0d96394-kube-api-access-58d6h\") pod \"speaker-nzg7f\" (UID: \"3f34c830-b1bc-433a-af20-0db4f0d96394\") " pod="metallb-system/speaker-nzg7f"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.588965 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-944bc\" (UniqueName: \"kubernetes.io/projected/ec498aeb-7c28-4e30-adee-e4546d01d498-kube-api-access-944bc\") pod \"controller-68d546b9d8-xszcp\" (UID: \"ec498aeb-7c28-4e30-adee-e4546d01d498\") " pod="metallb-system/controller-68d546b9d8-xszcp"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.588988 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3f34c830-b1bc-433a-af20-0db4f0d96394-memberlist\") pod \"speaker-nzg7f\" (UID: \"3f34c830-b1bc-433a-af20-0db4f0d96394\") " pod="metallb-system/speaker-nzg7f"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.589007 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec498aeb-7c28-4e30-adee-e4546d01d498-cert\") pod \"controller-68d546b9d8-xszcp\" (UID: \"ec498aeb-7c28-4e30-adee-e4546d01d498\") " pod="metallb-system/controller-68d546b9d8-xszcp"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.589032 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3f34c830-b1bc-433a-af20-0db4f0d96394-metallb-excludel2\") pod \"speaker-nzg7f\" (UID: \"3f34c830-b1bc-433a-af20-0db4f0d96394\") " pod="metallb-system/speaker-nzg7f"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.589066 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f34c830-b1bc-433a-af20-0db4f0d96394-metrics-certs\") pod \"speaker-nzg7f\" (UID: \"3f34c830-b1bc-433a-af20-0db4f0d96394\") " pod="metallb-system/speaker-nzg7f"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.589085 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec498aeb-7c28-4e30-adee-e4546d01d498-metrics-certs\") pod \"controller-68d546b9d8-xszcp\" (UID: \"ec498aeb-7c28-4e30-adee-e4546d01d498\") " pod="metallb-system/controller-68d546b9d8-xszcp"
Oct 07 17:17:18 crc kubenswrapper[4681]: E1007 17:17:18.589606 4681 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Oct 07 17:17:18 crc kubenswrapper[4681]: E1007 17:17:18.589659 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f34c830-b1bc-433a-af20-0db4f0d96394-memberlist podName:3f34c830-b1bc-433a-af20-0db4f0d96394 nodeName:}" failed. No retries permitted until 2025-10-07 17:17:19.089642119 +0000 UTC m=+842.737053674 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3f34c830-b1bc-433a-af20-0db4f0d96394-memberlist") pod "speaker-nzg7f" (UID: "3f34c830-b1bc-433a-af20-0db4f0d96394") : secret "metallb-memberlist" not found
Oct 07 17:17:18 crc kubenswrapper[4681]: E1007 17:17:18.589950 4681 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found
Oct 07 17:17:18 crc kubenswrapper[4681]: E1007 17:17:18.589979 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f34c830-b1bc-433a-af20-0db4f0d96394-metrics-certs podName:3f34c830-b1bc-433a-af20-0db4f0d96394 nodeName:}" failed. No retries permitted until 2025-10-07 17:17:19.089971718 +0000 UTC m=+842.737383273 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3f34c830-b1bc-433a-af20-0db4f0d96394-metrics-certs") pod "speaker-nzg7f" (UID: "3f34c830-b1bc-433a-af20-0db4f0d96394") : secret "speaker-certs-secret" not found
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.590590 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3f34c830-b1bc-433a-af20-0db4f0d96394-metallb-excludel2\") pod \"speaker-nzg7f\" (UID: \"3f34c830-b1bc-433a-af20-0db4f0d96394\") " pod="metallb-system/speaker-nzg7f"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.594306 4681 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.594750 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec498aeb-7c28-4e30-adee-e4546d01d498-metrics-certs\") pod \"controller-68d546b9d8-xszcp\" (UID: \"ec498aeb-7c28-4e30-adee-e4546d01d498\") " pod="metallb-system/controller-68d546b9d8-xszcp"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.605246 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec498aeb-7c28-4e30-adee-e4546d01d498-cert\") pod \"controller-68d546b9d8-xszcp\" (UID: \"ec498aeb-7c28-4e30-adee-e4546d01d498\") " pod="metallb-system/controller-68d546b9d8-xszcp"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.611380 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58d6h\" (UniqueName: \"kubernetes.io/projected/3f34c830-b1bc-433a-af20-0db4f0d96394-kube-api-access-58d6h\") pod \"speaker-nzg7f\" (UID: \"3f34c830-b1bc-433a-af20-0db4f0d96394\") " pod="metallb-system/speaker-nzg7f"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.615515 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-944bc\" (UniqueName: \"kubernetes.io/projected/ec498aeb-7c28-4e30-adee-e4546d01d498-kube-api-access-944bc\") pod \"controller-68d546b9d8-xszcp\" (UID: \"ec498aeb-7c28-4e30-adee-e4546d01d498\") " pod="metallb-system/controller-68d546b9d8-xszcp"
Oct 07 17:17:18 crc kubenswrapper[4681]: I1007 17:17:18.659557 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-xszcp"
Oct 07 17:17:19 crc kubenswrapper[4681]: I1007 17:17:19.004397 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-zcsl2"]
Oct 07 17:17:19 crc kubenswrapper[4681]: W1007 17:17:19.008510 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0227af93_e3dc_47c9_b6ce_57d25fc998ea.slice/crio-b275156b34be30569ef4d3316967f77d152a6b7c475a8fa337d797d61f45f088 WatchSource:0}: Error finding container b275156b34be30569ef4d3316967f77d152a6b7c475a8fa337d797d61f45f088: Status 404 returned error can't find the container with id b275156b34be30569ef4d3316967f77d152a6b7c475a8fa337d797d61f45f088
Oct 07 17:17:19 crc kubenswrapper[4681]: I1007 17:17:19.066726 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-xszcp"]
Oct 07 17:17:19 crc kubenswrapper[4681]: I1007 17:17:19.096085 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3f34c830-b1bc-433a-af20-0db4f0d96394-memberlist\") pod \"speaker-nzg7f\" (UID: \"3f34c830-b1bc-433a-af20-0db4f0d96394\") " pod="metallb-system/speaker-nzg7f"
Oct 07 17:17:19 crc kubenswrapper[4681]: I1007 17:17:19.096168 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f34c830-b1bc-433a-af20-0db4f0d96394-metrics-certs\") pod \"speaker-nzg7f\" (UID: \"3f34c830-b1bc-433a-af20-0db4f0d96394\") " pod="metallb-system/speaker-nzg7f"
Oct 07 17:17:19 crc kubenswrapper[4681]: E1007 17:17:19.096229 4681 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Oct 07 17:17:19 crc kubenswrapper[4681]: E1007 17:17:19.096287 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f34c830-b1bc-433a-af20-0db4f0d96394-memberlist podName:3f34c830-b1bc-433a-af20-0db4f0d96394 nodeName:}" failed. No retries permitted until 2025-10-07 17:17:20.096273479 +0000 UTC m=+843.743685034 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3f34c830-b1bc-433a-af20-0db4f0d96394-memberlist") pod "speaker-nzg7f" (UID: "3f34c830-b1bc-433a-af20-0db4f0d96394") : secret "metallb-memberlist" not found
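The memberlist failures above retry after 500ms and then 1s: each failed MountVolume.SetUp is parked in the pending-operations table with an exponentially growing durationBeforeRetry until the metallb-memberlist secret is published. A sketch of that doubling backoff; the cap below is an assumption for illustration, not a value read from kubelet source:

```go
// Illustrative doubling backoff matching the retries visible above.
package main

import (
	"fmt"
	"time"
)

func durationBeforeRetry(attempt int) time.Duration {
	d := 500 * time.Millisecond << attempt // 500ms, 1s, 2s, ...
	if limit := 2 * time.Minute; d > limit {
		d = limit // assumed cap: keep a missing secret from pushing retries out forever
	}
	return d
}

func main() {
	for attempt := 0; attempt < 4; attempt++ {
		fmt.Printf("attempt %d: durationBeforeRetry %v\n", attempt, durationBeforeRetry(attempt))
	}
}
```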
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3f34c830-b1bc-433a-af20-0db4f0d96394-memberlist") pod "speaker-nzg7f" (UID: "3f34c830-b1bc-433a-af20-0db4f0d96394") : secret "metallb-memberlist" not found Oct 07 17:17:19 crc kubenswrapper[4681]: I1007 17:17:19.102187 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f34c830-b1bc-433a-af20-0db4f0d96394-metrics-certs\") pod \"speaker-nzg7f\" (UID: \"3f34c830-b1bc-433a-af20-0db4f0d96394\") " pod="metallb-system/speaker-nzg7f" Oct 07 17:17:19 crc kubenswrapper[4681]: I1007 17:17:19.238703 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-xszcp" event={"ID":"ec498aeb-7c28-4e30-adee-e4546d01d498","Type":"ContainerStarted","Data":"e93966d6e8baed38e88890fce13b7f01a699894c13b0f64daafb81288f107c49"} Oct 07 17:17:19 crc kubenswrapper[4681]: I1007 17:17:19.238998 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-xszcp" event={"ID":"ec498aeb-7c28-4e30-adee-e4546d01d498","Type":"ContainerStarted","Data":"f77cc32fb64358e984d42a0066b1613377295cb6d7aa61f65df4e93168ae3818"} Oct 07 17:17:19 crc kubenswrapper[4681]: I1007 17:17:19.240320 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7f48v" event={"ID":"45c71c49-f633-48af-a495-a1bdf06d66b9","Type":"ContainerStarted","Data":"d924dfd615becbae8e53f968f775d6c6d2f953785f732baa3f84e3788a2ce937"} Oct 07 17:17:19 crc kubenswrapper[4681]: I1007 17:17:19.241297 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zcsl2" event={"ID":"0227af93-e3dc-47c9-b6ce-57d25fc998ea","Type":"ContainerStarted","Data":"b275156b34be30569ef4d3316967f77d152a6b7c475a8fa337d797d61f45f088"} Oct 07 17:17:20 crc kubenswrapper[4681]: I1007 17:17:20.108126 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3f34c830-b1bc-433a-af20-0db4f0d96394-memberlist\") pod \"speaker-nzg7f\" (UID: \"3f34c830-b1bc-433a-af20-0db4f0d96394\") " pod="metallb-system/speaker-nzg7f" Oct 07 17:17:20 crc kubenswrapper[4681]: I1007 17:17:20.113549 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3f34c830-b1bc-433a-af20-0db4f0d96394-memberlist\") pod \"speaker-nzg7f\" (UID: \"3f34c830-b1bc-433a-af20-0db4f0d96394\") " pod="metallb-system/speaker-nzg7f" Oct 07 17:17:20 crc kubenswrapper[4681]: I1007 17:17:20.147302 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-nzg7f"
Oct 07 17:17:20 crc kubenswrapper[4681]: I1007 17:17:20.250205 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-nzg7f" event={"ID":"3f34c830-b1bc-433a-af20-0db4f0d96394","Type":"ContainerStarted","Data":"e91fada50c8755f8b0b29476a90e97c40eb0bc423e6f3b17d840b3e3775194c3"}
Oct 07 17:17:20 crc kubenswrapper[4681]: I1007 17:17:20.251617 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-xszcp" event={"ID":"ec498aeb-7c28-4e30-adee-e4546d01d498","Type":"ContainerStarted","Data":"b90491ab7cdfe223bec8ceaf99901a67dff6179f84876ec14cef45992b106b9d"}
Oct 07 17:17:20 crc kubenswrapper[4681]: I1007 17:17:20.252791 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-xszcp"
Oct 07 17:17:20 crc kubenswrapper[4681]: I1007 17:17:20.285585 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-xszcp" podStartSLOduration=2.285571467 podStartE2EDuration="2.285571467s" podCreationTimestamp="2025-10-07 17:17:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:17:20.281634688 +0000 UTC m=+843.929046243" watchObservedRunningTime="2025-10-07 17:17:20.285571467 +0000 UTC m=+843.932983012"
Oct 07 17:17:21 crc kubenswrapper[4681]: I1007 17:17:21.273632 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-nzg7f" event={"ID":"3f34c830-b1bc-433a-af20-0db4f0d96394","Type":"ContainerStarted","Data":"11bc067b4b5586bbf8fbdc44eccc3dbeeb795a83be5ef4de33cd41c6c3163ef9"}
Oct 07 17:17:21 crc kubenswrapper[4681]: I1007 17:17:21.273681 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-nzg7f" event={"ID":"3f34c830-b1bc-433a-af20-0db4f0d96394","Type":"ContainerStarted","Data":"b000069c8bd6e515bd53e35c48fd7f6ae62a8598f38c0bc56dc797efd1c23ded"}
Oct 07 17:17:21 crc kubenswrapper[4681]: I1007 17:17:21.299333 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-nzg7f" podStartSLOduration=3.299312691 podStartE2EDuration="3.299312691s" podCreationTimestamp="2025-10-07 17:17:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:17:21.297898654 +0000 UTC m=+844.945310209" watchObservedRunningTime="2025-10-07 17:17:21.299312691 +0000 UTC m=+844.946724246"
Oct 07 17:17:22 crc kubenswrapper[4681]: I1007 17:17:22.282474 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-nzg7f"
Oct 07 17:17:27 crc kubenswrapper[4681]: I1007 17:17:27.314251 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zcsl2" event={"ID":"0227af93-e3dc-47c9-b6ce-57d25fc998ea","Type":"ContainerStarted","Data":"28480c101d222e0ce7ab6e6da1c4f24582820a76779b678c30e0209468b8de88"}
Oct 07 17:17:27 crc kubenswrapper[4681]: I1007 17:17:27.314747 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zcsl2"
Oct 07 17:17:27 crc kubenswrapper[4681]: I1007 17:17:27.316791 4681 generic.go:334] "Generic (PLEG): container finished" podID="45c71c49-f633-48af-a495-a1bdf06d66b9" containerID="1e76504ce776db7f6c2d84f984e29dd8b5df2d2eca7745fe7b2f8291ffc6b46f" exitCode=0
Oct 07 17:17:27 crc kubenswrapper[4681]: I1007 17:17:27.316944 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7f48v" event={"ID":"45c71c49-f633-48af-a495-a1bdf06d66b9","Type":"ContainerDied","Data":"1e76504ce776db7f6c2d84f984e29dd8b5df2d2eca7745fe7b2f8291ffc6b46f"}
Oct 07 17:17:27 crc kubenswrapper[4681]: I1007 17:17:27.330655 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zcsl2" podStartSLOduration=1.463001899 podStartE2EDuration="9.330635243s" podCreationTimestamp="2025-10-07 17:17:18 +0000 UTC" firstStartedPulling="2025-10-07 17:17:19.01111752 +0000 UTC m=+842.658529075" lastFinishedPulling="2025-10-07 17:17:26.878750864 +0000 UTC m=+850.526162419" observedRunningTime="2025-10-07 17:17:27.326027174 +0000 UTC m=+850.973438739" watchObservedRunningTime="2025-10-07 17:17:27.330635243 +0000 UTC m=+850.978046798"
Oct 07 17:17:28 crc kubenswrapper[4681]: I1007 17:17:28.323890 4681 generic.go:334] "Generic (PLEG): container finished" podID="45c71c49-f633-48af-a495-a1bdf06d66b9" containerID="1e67ef22a42891c983b24c4e35e6423d7edba12e8241bccce860a03dbf821cdf" exitCode=0
Oct 07 17:17:28 crc kubenswrapper[4681]: I1007 17:17:28.323997 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7f48v" event={"ID":"45c71c49-f633-48af-a495-a1bdf06d66b9","Type":"ContainerDied","Data":"1e67ef22a42891c983b24c4e35e6423d7edba12e8241bccce860a03dbf821cdf"}
Oct 07 17:17:29 crc kubenswrapper[4681]: I1007 17:17:29.330480 4681 generic.go:334] "Generic (PLEG): container finished" podID="45c71c49-f633-48af-a495-a1bdf06d66b9" containerID="90eab1b17bd049670423ebc192df48b3338baf915bf718db9803599ddc8432fa" exitCode=0
Oct 07 17:17:29 crc kubenswrapper[4681]: I1007 17:17:29.331267 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7f48v" event={"ID":"45c71c49-f633-48af-a495-a1bdf06d66b9","Type":"ContainerDied","Data":"90eab1b17bd049670423ebc192df48b3338baf915bf718db9803599ddc8432fa"}
Oct 07 17:17:30 crc kubenswrapper[4681]: I1007 17:17:30.177150 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-nzg7f"
Oct 07 17:17:30 crc kubenswrapper[4681]: I1007 17:17:30.341563 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7f48v" event={"ID":"45c71c49-f633-48af-a495-a1bdf06d66b9","Type":"ContainerStarted","Data":"a86d7b9301e2da7ebf90f04b10b08d1eb034c834c12cddf3719652ed8d0ab21e"}
Oct 07 17:17:30 crc kubenswrapper[4681]: I1007 17:17:30.341607 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7f48v" event={"ID":"45c71c49-f633-48af-a495-a1bdf06d66b9","Type":"ContainerStarted","Data":"19ad0b27e9d3d9a9983cc5deaa4853164d5d4e17935be36123ebd60edf142351"}
Oct 07 17:17:30 crc kubenswrapper[4681]: I1007 17:17:30.341621 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7f48v" event={"ID":"45c71c49-f633-48af-a495-a1bdf06d66b9","Type":"ContainerStarted","Data":"5f1096de29d07be29252642af2dcfe17b3eb099c3e57836c5cf26451c0a6127f"}
Oct 07 17:17:30 crc kubenswrapper[4681]: I1007 17:17:30.341632 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7f48v" event={"ID":"45c71c49-f633-48af-a495-a1bdf06d66b9","Type":"ContainerStarted","Data":"9dfa40a56b11a1d812fc012be47644253447c00571fb0a574b47f338b17b210b"}
Oct 07 17:17:30 crc kubenswrapper[4681]: I1007 17:17:30.341643 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7f48v" event={"ID":"45c71c49-f633-48af-a495-a1bdf06d66b9","Type":"ContainerStarted","Data":"df2cb05ec71e1fad75996196ba3f37ce7fb0ebabd894d5136bc9c95bc73f1ea5"}
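
The "Observed pod startup duration" entries above are internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling); pods that never pulled (the 0001-01-01 sentinel timestamps) report identical SLO and E2E values. A minimal Go sketch recomputing the frr-k8s-webhook-server-64bf5d555-zcsl2 figures from the timestamps logged above (the arithmetic is inferred from these numbers, not taken from kubelet source):

package main

import (
	"fmt"
	"time"
)

// Recomputes the durations reported by pod_startup_latency_tracker for the
// frr-k8s-webhook-server entry above. Timestamps are copied from the log,
// with the monotonic "m=+..." suffix dropped before parsing.
func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2025-10-07 17:17:18 +0000 UTC")
	firstPull := parse("2025-10-07 17:17:19.01111752 +0000 UTC")
	lastPull := parse("2025-10-07 17:17:26.878750864 +0000 UTC")
	observed := parse("2025-10-07 17:17:27.330635243 +0000 UTC")

	e2e := observed.Sub(created)              // end-to-end startup duration
	slo := e2e - lastPull.Sub(firstPull)      // E2E minus the image-pull window
	fmt.Println(e2e, slo)
}

Running it prints 9.330635243s and 1.463001899s, matching the podStartE2EDuration and podStartSLOduration values in the entry above.
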
pod" pod="metallb-system/frr-k8s-7f48v" event={"ID":"45c71c49-f633-48af-a495-a1bdf06d66b9","Type":"ContainerStarted","Data":"df2cb05ec71e1fad75996196ba3f37ce7fb0ebabd894d5136bc9c95bc73f1ea5"} Oct 07 17:17:30 crc kubenswrapper[4681]: I1007 17:17:30.341654 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7f48v" event={"ID":"45c71c49-f633-48af-a495-a1bdf06d66b9","Type":"ContainerStarted","Data":"672b726d88750707374ce41fac434a8a6e76511b6231edd8f948688a288bc26c"} Oct 07 17:17:30 crc kubenswrapper[4681]: I1007 17:17:30.341767 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-7f48v" Oct 07 17:17:30 crc kubenswrapper[4681]: I1007 17:17:30.380815 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-7f48v" podStartSLOduration=4.168547719 podStartE2EDuration="12.380795504s" podCreationTimestamp="2025-10-07 17:17:18 +0000 UTC" firstStartedPulling="2025-10-07 17:17:18.683999295 +0000 UTC m=+842.331410850" lastFinishedPulling="2025-10-07 17:17:26.89624708 +0000 UTC m=+850.543658635" observedRunningTime="2025-10-07 17:17:30.377392275 +0000 UTC m=+854.024803830" watchObservedRunningTime="2025-10-07 17:17:30.380795504 +0000 UTC m=+854.028207059" Oct 07 17:17:33 crc kubenswrapper[4681]: I1007 17:17:33.400827 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-t9hxw"] Oct 07 17:17:33 crc kubenswrapper[4681]: I1007 17:17:33.402030 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-t9hxw" Oct 07 17:17:33 crc kubenswrapper[4681]: I1007 17:17:33.404246 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 07 17:17:33 crc kubenswrapper[4681]: I1007 17:17:33.404278 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-8mbpb" Oct 07 17:17:33 crc kubenswrapper[4681]: I1007 17:17:33.404577 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 07 17:17:33 crc kubenswrapper[4681]: I1007 17:17:33.408008 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-t9hxw"] Oct 07 17:17:33 crc kubenswrapper[4681]: I1007 17:17:33.502117 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssn52\" (UniqueName: \"kubernetes.io/projected/acccca59-29d9-485b-9438-065e380a9043-kube-api-access-ssn52\") pod \"openstack-operator-index-t9hxw\" (UID: \"acccca59-29d9-485b-9438-065e380a9043\") " pod="openstack-operators/openstack-operator-index-t9hxw" Oct 07 17:17:33 crc kubenswrapper[4681]: I1007 17:17:33.545788 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-7f48v" Oct 07 17:17:33 crc kubenswrapper[4681]: I1007 17:17:33.589138 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-7f48v" Oct 07 17:17:33 crc kubenswrapper[4681]: I1007 17:17:33.603454 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssn52\" (UniqueName: \"kubernetes.io/projected/acccca59-29d9-485b-9438-065e380a9043-kube-api-access-ssn52\") pod \"openstack-operator-index-t9hxw\" (UID: \"acccca59-29d9-485b-9438-065e380a9043\") " 
pod="openstack-operators/openstack-operator-index-t9hxw" Oct 07 17:17:33 crc kubenswrapper[4681]: I1007 17:17:33.627228 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssn52\" (UniqueName: \"kubernetes.io/projected/acccca59-29d9-485b-9438-065e380a9043-kube-api-access-ssn52\") pod \"openstack-operator-index-t9hxw\" (UID: \"acccca59-29d9-485b-9438-065e380a9043\") " pod="openstack-operators/openstack-operator-index-t9hxw" Oct 07 17:17:33 crc kubenswrapper[4681]: I1007 17:17:33.724299 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-t9hxw" Oct 07 17:17:34 crc kubenswrapper[4681]: I1007 17:17:34.184057 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-t9hxw"] Oct 07 17:17:34 crc kubenswrapper[4681]: I1007 17:17:34.366771 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-t9hxw" event={"ID":"acccca59-29d9-485b-9438-065e380a9043","Type":"ContainerStarted","Data":"d0e92a133893760a9901d1747b21f24fa4470762bc7599c55018dc0bd2a1e335"} Oct 07 17:17:36 crc kubenswrapper[4681]: I1007 17:17:36.387239 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-t9hxw" event={"ID":"acccca59-29d9-485b-9438-065e380a9043","Type":"ContainerStarted","Data":"519db9d12ee4e2a67f3aa3fdd5b66ca29666bf95cbd1299996c117226f31168a"} Oct 07 17:17:36 crc kubenswrapper[4681]: I1007 17:17:36.401555 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-t9hxw" podStartSLOduration=1.414738185 podStartE2EDuration="3.401538921s" podCreationTimestamp="2025-10-07 17:17:33 +0000 UTC" firstStartedPulling="2025-10-07 17:17:34.191018299 +0000 UTC m=+857.838429864" lastFinishedPulling="2025-10-07 17:17:36.177819045 +0000 UTC m=+859.825230600" observedRunningTime="2025-10-07 17:17:36.40033527 +0000 UTC m=+860.047756616" watchObservedRunningTime="2025-10-07 17:17:36.401538921 +0000 UTC m=+860.048950476" Oct 07 17:17:36 crc kubenswrapper[4681]: I1007 17:17:36.776457 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-t9hxw"] Oct 07 17:17:37 crc kubenswrapper[4681]: I1007 17:17:37.402119 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-w4jqq"] Oct 07 17:17:37 crc kubenswrapper[4681]: I1007 17:17:37.403163 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-w4jqq" Oct 07 17:17:37 crc kubenswrapper[4681]: I1007 17:17:37.409809 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-w4jqq"] Oct 07 17:17:37 crc kubenswrapper[4681]: I1007 17:17:37.467738 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd6h2\" (UniqueName: \"kubernetes.io/projected/0613f93f-af7c-4a36-8baa-642a076f5666-kube-api-access-jd6h2\") pod \"openstack-operator-index-w4jqq\" (UID: \"0613f93f-af7c-4a36-8baa-642a076f5666\") " pod="openstack-operators/openstack-operator-index-w4jqq" Oct 07 17:17:37 crc kubenswrapper[4681]: I1007 17:17:37.569224 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd6h2\" (UniqueName: \"kubernetes.io/projected/0613f93f-af7c-4a36-8baa-642a076f5666-kube-api-access-jd6h2\") pod \"openstack-operator-index-w4jqq\" (UID: \"0613f93f-af7c-4a36-8baa-642a076f5666\") " pod="openstack-operators/openstack-operator-index-w4jqq" Oct 07 17:17:37 crc kubenswrapper[4681]: I1007 17:17:37.598190 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd6h2\" (UniqueName: \"kubernetes.io/projected/0613f93f-af7c-4a36-8baa-642a076f5666-kube-api-access-jd6h2\") pod \"openstack-operator-index-w4jqq\" (UID: \"0613f93f-af7c-4a36-8baa-642a076f5666\") " pod="openstack-operators/openstack-operator-index-w4jqq" Oct 07 17:17:37 crc kubenswrapper[4681]: I1007 17:17:37.738993 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-w4jqq" Oct 07 17:17:38 crc kubenswrapper[4681]: I1007 17:17:38.183454 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-w4jqq"] Oct 07 17:17:38 crc kubenswrapper[4681]: I1007 17:17:38.402615 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-w4jqq" event={"ID":"0613f93f-af7c-4a36-8baa-642a076f5666","Type":"ContainerStarted","Data":"ef421435fc346dc37b97d863fa1ec1a3a7794b4100c97622764233fb1890308e"} Oct 07 17:17:38 crc kubenswrapper[4681]: I1007 17:17:38.403216 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-t9hxw" podUID="acccca59-29d9-485b-9438-065e380a9043" containerName="registry-server" containerID="cri-o://519db9d12ee4e2a67f3aa3fdd5b66ca29666bf95cbd1299996c117226f31168a" gracePeriod=2 Oct 07 17:17:38 crc kubenswrapper[4681]: I1007 17:17:38.564012 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zcsl2" Oct 07 17:17:38 crc kubenswrapper[4681]: I1007 17:17:38.670828 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-xszcp" Oct 07 17:17:38 crc kubenswrapper[4681]: I1007 17:17:38.802908 4681 util.go:48] "No ready sandbox for pod can be found. 
Oct 07 17:17:38 crc kubenswrapper[4681]: I1007 17:17:38.889119 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssn52\" (UniqueName: \"kubernetes.io/projected/acccca59-29d9-485b-9438-065e380a9043-kube-api-access-ssn52\") pod \"acccca59-29d9-485b-9438-065e380a9043\" (UID: \"acccca59-29d9-485b-9438-065e380a9043\") "
Oct 07 17:17:38 crc kubenswrapper[4681]: I1007 17:17:38.895012 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acccca59-29d9-485b-9438-065e380a9043-kube-api-access-ssn52" (OuterVolumeSpecName: "kube-api-access-ssn52") pod "acccca59-29d9-485b-9438-065e380a9043" (UID: "acccca59-29d9-485b-9438-065e380a9043"). InnerVolumeSpecName "kube-api-access-ssn52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 17:17:38 crc kubenswrapper[4681]: I1007 17:17:38.990553 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssn52\" (UniqueName: \"kubernetes.io/projected/acccca59-29d9-485b-9438-065e380a9043-kube-api-access-ssn52\") on node \"crc\" DevicePath \"\""
Oct 07 17:17:39 crc kubenswrapper[4681]: I1007 17:17:39.409839 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-w4jqq" event={"ID":"0613f93f-af7c-4a36-8baa-642a076f5666","Type":"ContainerStarted","Data":"2bc0d9c5c0a8c337963204c670041c2b67b37a01a9fb77b32cba779be42768ab"}
Oct 07 17:17:39 crc kubenswrapper[4681]: I1007 17:17:39.412817 4681 generic.go:334] "Generic (PLEG): container finished" podID="acccca59-29d9-485b-9438-065e380a9043" containerID="519db9d12ee4e2a67f3aa3fdd5b66ca29666bf95cbd1299996c117226f31168a" exitCode=0
Oct 07 17:17:39 crc kubenswrapper[4681]: I1007 17:17:39.412857 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-t9hxw" event={"ID":"acccca59-29d9-485b-9438-065e380a9043","Type":"ContainerDied","Data":"519db9d12ee4e2a67f3aa3fdd5b66ca29666bf95cbd1299996c117226f31168a"}
Oct 07 17:17:39 crc kubenswrapper[4681]: I1007 17:17:39.412905 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-t9hxw" event={"ID":"acccca59-29d9-485b-9438-065e380a9043","Type":"ContainerDied","Data":"d0e92a133893760a9901d1747b21f24fa4470762bc7599c55018dc0bd2a1e335"}
Oct 07 17:17:39 crc kubenswrapper[4681]: I1007 17:17:39.412927 4681 scope.go:117] "RemoveContainer" containerID="519db9d12ee4e2a67f3aa3fdd5b66ca29666bf95cbd1299996c117226f31168a"
Oct 07 17:17:39 crc kubenswrapper[4681]: I1007 17:17:39.413030 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-t9hxw"
Oct 07 17:17:39 crc kubenswrapper[4681]: I1007 17:17:39.426723 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-w4jqq" podStartSLOduration=2.3407923139999998 podStartE2EDuration="2.426704781s" podCreationTimestamp="2025-10-07 17:17:37 +0000 UTC" firstStartedPulling="2025-10-07 17:17:38.187164298 +0000 UTC m=+861.834575853" lastFinishedPulling="2025-10-07 17:17:38.273076765 +0000 UTC m=+861.920488320" observedRunningTime="2025-10-07 17:17:39.425713865 +0000 UTC m=+863.073125420" watchObservedRunningTime="2025-10-07 17:17:39.426704781 +0000 UTC m=+863.074116326"
Oct 07 17:17:39 crc kubenswrapper[4681]: I1007 17:17:39.435241 4681 scope.go:117] "RemoveContainer" containerID="519db9d12ee4e2a67f3aa3fdd5b66ca29666bf95cbd1299996c117226f31168a"
Oct 07 17:17:39 crc kubenswrapper[4681]: E1007 17:17:39.435677 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"519db9d12ee4e2a67f3aa3fdd5b66ca29666bf95cbd1299996c117226f31168a\": container with ID starting with 519db9d12ee4e2a67f3aa3fdd5b66ca29666bf95cbd1299996c117226f31168a not found: ID does not exist" containerID="519db9d12ee4e2a67f3aa3fdd5b66ca29666bf95cbd1299996c117226f31168a"
Oct 07 17:17:39 crc kubenswrapper[4681]: I1007 17:17:39.435719 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"519db9d12ee4e2a67f3aa3fdd5b66ca29666bf95cbd1299996c117226f31168a"} err="failed to get container status \"519db9d12ee4e2a67f3aa3fdd5b66ca29666bf95cbd1299996c117226f31168a\": rpc error: code = NotFound desc = could not find container \"519db9d12ee4e2a67f3aa3fdd5b66ca29666bf95cbd1299996c117226f31168a\": container with ID starting with 519db9d12ee4e2a67f3aa3fdd5b66ca29666bf95cbd1299996c117226f31168a not found: ID does not exist"
Oct 07 17:17:39 crc kubenswrapper[4681]: I1007 17:17:39.441439 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-t9hxw"]
Oct 07 17:17:39 crc kubenswrapper[4681]: I1007 17:17:39.448004 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-t9hxw"]
Oct 07 17:17:41 crc kubenswrapper[4681]: I1007 17:17:41.046230 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acccca59-29d9-485b-9438-065e380a9043" path="/var/lib/kubelet/pods/acccca59-29d9-485b-9438-065e380a9043/volumes"
Oct 07 17:17:42 crc kubenswrapper[4681]: I1007 17:17:42.194691 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 17:17:42 crc kubenswrapper[4681]: I1007 17:17:42.194749 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 17:17:47 crc kubenswrapper[4681]: I1007 17:17:47.740383 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-w4jqq"
Oct 07 17:17:47 crc kubenswrapper[4681]: I1007 17:17:47.740905 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-w4jqq"
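
The openstack-operator-index-t9hxw teardown above is one sequence: the API DELETE at 17:17:36.776457, the gracePeriod=2 kill at 17:17:38.403216 once the replacement index pod was starting, the ContainerDied/RemoveContainer pair at 17:17:39, and finally the NotFound errors, which only say that CRI-O had already removed the container by the time kubelet re-queried its status; they are cleanup noise, not a failure. A hypothetical offline helper (not part of kubelet) that pulls such kill events out of a journal stream, e.g. journalctl -u kubelet | go run scan.go:

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// killRe matches the "Killing container with a grace period" entries seen in
// this journal; the field layout is copied from the log lines above.
var killRe = regexp.MustCompile(`Killing container with a grace period" pod="([^"]+)".*containerID="([^"]+)" gracePeriod=(\d+)`)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		if m := killRe.FindStringSubmatch(sc.Text()); m != nil {
			fmt.Printf("pod=%s container=%s gracePeriod=%ss\n", m[1], m[2], m[3])
		}
	}
}

On the portion of the journal shown here it would surface two kills: the t9hxw registry-server (gracePeriod=2) and, later, the machine-config-daemon restart (gracePeriod=600).
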
Oct 07 17:17:47 crc kubenswrapper[4681]: I1007 17:17:47.776845 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-w4jqq"
Oct 07 17:17:48 crc kubenswrapper[4681]: I1007 17:17:48.479184 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-w4jqq"
Oct 07 17:17:48 crc kubenswrapper[4681]: I1007 17:17:48.548393 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-7f48v"
Oct 07 17:17:55 crc kubenswrapper[4681]: I1007 17:17:55.284421 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4"]
Oct 07 17:17:55 crc kubenswrapper[4681]: E1007 17:17:55.285128 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acccca59-29d9-485b-9438-065e380a9043" containerName="registry-server"
Oct 07 17:17:55 crc kubenswrapper[4681]: I1007 17:17:55.285144 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="acccca59-29d9-485b-9438-065e380a9043" containerName="registry-server"
Oct 07 17:17:55 crc kubenswrapper[4681]: I1007 17:17:55.285268 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="acccca59-29d9-485b-9438-065e380a9043" containerName="registry-server"
Oct 07 17:17:55 crc kubenswrapper[4681]: I1007 17:17:55.286215 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4"
Oct 07 17:17:55 crc kubenswrapper[4681]: I1007 17:17:55.288224 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-dsh4j"
Oct 07 17:17:55 crc kubenswrapper[4681]: I1007 17:17:55.319718 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4"]
Oct 07 17:17:55 crc kubenswrapper[4681]: I1007 17:17:55.394170 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/accebcc3-c13d-4dab-bb1b-97f95eb370f3-bundle\") pod \"886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4\" (UID: \"accebcc3-c13d-4dab-bb1b-97f95eb370f3\") " pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4"
Oct 07 17:17:55 crc kubenswrapper[4681]: I1007 17:17:55.394251 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/accebcc3-c13d-4dab-bb1b-97f95eb370f3-util\") pod \"886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4\" (UID: \"accebcc3-c13d-4dab-bb1b-97f95eb370f3\") " pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4"
Oct 07 17:17:55 crc kubenswrapper[4681]: I1007 17:17:55.394278 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v8hr\" (UniqueName: \"kubernetes.io/projected/accebcc3-c13d-4dab-bb1b-97f95eb370f3-kube-api-access-7v8hr\") pod \"886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4\" (UID: \"accebcc3-c13d-4dab-bb1b-97f95eb370f3\") " pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4"
Oct 07 17:17:55 crc kubenswrapper[4681]: I1007 17:17:55.495164 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/accebcc3-c13d-4dab-bb1b-97f95eb370f3-bundle\") pod \"886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4\" (UID: \"accebcc3-c13d-4dab-bb1b-97f95eb370f3\") " pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4"
Oct 07 17:17:55 crc kubenswrapper[4681]: I1007 17:17:55.495229 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/accebcc3-c13d-4dab-bb1b-97f95eb370f3-util\") pod \"886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4\" (UID: \"accebcc3-c13d-4dab-bb1b-97f95eb370f3\") " pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4"
Oct 07 17:17:55 crc kubenswrapper[4681]: I1007 17:17:55.495256 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v8hr\" (UniqueName: \"kubernetes.io/projected/accebcc3-c13d-4dab-bb1b-97f95eb370f3-kube-api-access-7v8hr\") pod \"886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4\" (UID: \"accebcc3-c13d-4dab-bb1b-97f95eb370f3\") " pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4"
Oct 07 17:17:55 crc kubenswrapper[4681]: I1007 17:17:55.495790 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/accebcc3-c13d-4dab-bb1b-97f95eb370f3-util\") pod \"886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4\" (UID: \"accebcc3-c13d-4dab-bb1b-97f95eb370f3\") " pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4"
Oct 07 17:17:55 crc kubenswrapper[4681]: I1007 17:17:55.496091 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/accebcc3-c13d-4dab-bb1b-97f95eb370f3-bundle\") pod \"886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4\" (UID: \"accebcc3-c13d-4dab-bb1b-97f95eb370f3\") " pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4"
Oct 07 17:17:55 crc kubenswrapper[4681]: I1007 17:17:55.519714 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v8hr\" (UniqueName: \"kubernetes.io/projected/accebcc3-c13d-4dab-bb1b-97f95eb370f3-kube-api-access-7v8hr\") pod \"886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4\" (UID: \"accebcc3-c13d-4dab-bb1b-97f95eb370f3\") " pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4"
Oct 07 17:17:55 crc kubenswrapper[4681]: I1007 17:17:55.626025 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4"
Oct 07 17:17:56 crc kubenswrapper[4681]: I1007 17:17:56.015467 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4"]
Oct 07 17:17:56 crc kubenswrapper[4681]: W1007 17:17:56.023171 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaccebcc3_c13d_4dab_bb1b_97f95eb370f3.slice/crio-5d10c7db26a9fb05e30e2c6d03aca5df1c95dd8fcb27502d79772a5eae8cedd4 WatchSource:0}: Error finding container 5d10c7db26a9fb05e30e2c6d03aca5df1c95dd8fcb27502d79772a5eae8cedd4: Status 404 returned error can't find the container with id 5d10c7db26a9fb05e30e2c6d03aca5df1c95dd8fcb27502d79772a5eae8cedd4
Oct 07 17:17:56 crc kubenswrapper[4681]: I1007 17:17:56.500690 4681 generic.go:334] "Generic (PLEG): container finished" podID="accebcc3-c13d-4dab-bb1b-97f95eb370f3" containerID="eb16cc00949c14c234024aadd5afd2f5c1b91b0881be6f99ba937a30b72303a3" exitCode=0
Oct 07 17:17:56 crc kubenswrapper[4681]: I1007 17:17:56.500728 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4" event={"ID":"accebcc3-c13d-4dab-bb1b-97f95eb370f3","Type":"ContainerDied","Data":"eb16cc00949c14c234024aadd5afd2f5c1b91b0881be6f99ba937a30b72303a3"}
Oct 07 17:17:56 crc kubenswrapper[4681]: I1007 17:17:56.500751 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4" event={"ID":"accebcc3-c13d-4dab-bb1b-97f95eb370f3","Type":"ContainerStarted","Data":"5d10c7db26a9fb05e30e2c6d03aca5df1c95dd8fcb27502d79772a5eae8cedd4"}
Oct 07 17:17:57 crc kubenswrapper[4681]: I1007 17:17:57.507455 4681 generic.go:334] "Generic (PLEG): container finished" podID="accebcc3-c13d-4dab-bb1b-97f95eb370f3" containerID="d5b926789f100e7da7fd0201045248bcd69c8f7d86e69c6dc257c664656f6368" exitCode=0
Oct 07 17:17:57 crc kubenswrapper[4681]: I1007 17:17:57.507547 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4" event={"ID":"accebcc3-c13d-4dab-bb1b-97f95eb370f3","Type":"ContainerDied","Data":"d5b926789f100e7da7fd0201045248bcd69c8f7d86e69c6dc257c664656f6368"}
Oct 07 17:17:58 crc kubenswrapper[4681]: I1007 17:17:58.517212 4681 generic.go:334] "Generic (PLEG): container finished" podID="accebcc3-c13d-4dab-bb1b-97f95eb370f3" containerID="f4aece57a633fdf0cc76ba0de1a49ac0bbbed769bf118851ee5c2e31fbfb625f" exitCode=0
Oct 07 17:17:58 crc kubenswrapper[4681]: I1007 17:17:58.517317 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4" event={"ID":"accebcc3-c13d-4dab-bb1b-97f95eb370f3","Type":"ContainerDied","Data":"f4aece57a633fdf0cc76ba0de1a49ac0bbbed769bf118851ee5c2e31fbfb625f"}
Oct 07 17:17:59 crc kubenswrapper[4681]: I1007 17:17:59.759262 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4"
Oct 07 17:17:59 crc kubenswrapper[4681]: I1007 17:17:59.859057 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/accebcc3-c13d-4dab-bb1b-97f95eb370f3-bundle\") pod \"accebcc3-c13d-4dab-bb1b-97f95eb370f3\" (UID: \"accebcc3-c13d-4dab-bb1b-97f95eb370f3\") "
Oct 07 17:17:59 crc kubenswrapper[4681]: I1007 17:17:59.859116 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v8hr\" (UniqueName: \"kubernetes.io/projected/accebcc3-c13d-4dab-bb1b-97f95eb370f3-kube-api-access-7v8hr\") pod \"accebcc3-c13d-4dab-bb1b-97f95eb370f3\" (UID: \"accebcc3-c13d-4dab-bb1b-97f95eb370f3\") "
Oct 07 17:17:59 crc kubenswrapper[4681]: I1007 17:17:59.859155 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/accebcc3-c13d-4dab-bb1b-97f95eb370f3-util\") pod \"accebcc3-c13d-4dab-bb1b-97f95eb370f3\" (UID: \"accebcc3-c13d-4dab-bb1b-97f95eb370f3\") "
Oct 07 17:17:59 crc kubenswrapper[4681]: I1007 17:17:59.860345 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/accebcc3-c13d-4dab-bb1b-97f95eb370f3-bundle" (OuterVolumeSpecName: "bundle") pod "accebcc3-c13d-4dab-bb1b-97f95eb370f3" (UID: "accebcc3-c13d-4dab-bb1b-97f95eb370f3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 17:17:59 crc kubenswrapper[4681]: I1007 17:17:59.868996 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/accebcc3-c13d-4dab-bb1b-97f95eb370f3-kube-api-access-7v8hr" (OuterVolumeSpecName: "kube-api-access-7v8hr") pod "accebcc3-c13d-4dab-bb1b-97f95eb370f3" (UID: "accebcc3-c13d-4dab-bb1b-97f95eb370f3"). InnerVolumeSpecName "kube-api-access-7v8hr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 17:17:59 crc kubenswrapper[4681]: I1007 17:17:59.874650 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/accebcc3-c13d-4dab-bb1b-97f95eb370f3-util" (OuterVolumeSpecName: "util") pod "accebcc3-c13d-4dab-bb1b-97f95eb370f3" (UID: "accebcc3-c13d-4dab-bb1b-97f95eb370f3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 17:17:59 crc kubenswrapper[4681]: I1007 17:17:59.960036 4681 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/accebcc3-c13d-4dab-bb1b-97f95eb370f3-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 17:17:59 crc kubenswrapper[4681]: I1007 17:17:59.960074 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v8hr\" (UniqueName: \"kubernetes.io/projected/accebcc3-c13d-4dab-bb1b-97f95eb370f3-kube-api-access-7v8hr\") on node \"crc\" DevicePath \"\""
Oct 07 17:17:59 crc kubenswrapper[4681]: I1007 17:17:59.960090 4681 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/accebcc3-c13d-4dab-bb1b-97f95eb370f3-util\") on node \"crc\" DevicePath \"\""
Oct 07 17:18:00 crc kubenswrapper[4681]: I1007 17:18:00.530615 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4" event={"ID":"accebcc3-c13d-4dab-bb1b-97f95eb370f3","Type":"ContainerDied","Data":"5d10c7db26a9fb05e30e2c6d03aca5df1c95dd8fcb27502d79772a5eae8cedd4"}
Oct 07 17:18:00 crc kubenswrapper[4681]: I1007 17:18:00.530671 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4"
Oct 07 17:18:00 crc kubenswrapper[4681]: I1007 17:18:00.530684 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d10c7db26a9fb05e30e2c6d03aca5df1c95dd8fcb27502d79772a5eae8cedd4"
Oct 07 17:18:07 crc kubenswrapper[4681]: I1007 17:18:07.761496 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6687d89476-pv9kh"]
Oct 07 17:18:07 crc kubenswrapper[4681]: E1007 17:18:07.762203 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="accebcc3-c13d-4dab-bb1b-97f95eb370f3" containerName="util"
Oct 07 17:18:07 crc kubenswrapper[4681]: I1007 17:18:07.762215 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="accebcc3-c13d-4dab-bb1b-97f95eb370f3" containerName="util"
Oct 07 17:18:07 crc kubenswrapper[4681]: E1007 17:18:07.762230 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="accebcc3-c13d-4dab-bb1b-97f95eb370f3" containerName="pull"
Oct 07 17:18:07 crc kubenswrapper[4681]: I1007 17:18:07.762236 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="accebcc3-c13d-4dab-bb1b-97f95eb370f3" containerName="pull"
Oct 07 17:18:07 crc kubenswrapper[4681]: E1007 17:18:07.762248 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="accebcc3-c13d-4dab-bb1b-97f95eb370f3" containerName="extract"
Oct 07 17:18:07 crc kubenswrapper[4681]: I1007 17:18:07.762255 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="accebcc3-c13d-4dab-bb1b-97f95eb370f3" containerName="extract"
Oct 07 17:18:07 crc kubenswrapper[4681]: I1007 17:18:07.762378 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="accebcc3-c13d-4dab-bb1b-97f95eb370f3" containerName="extract"
Oct 07 17:18:07 crc kubenswrapper[4681]: I1007 17:18:07.762936 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6687d89476-pv9kh"
Oct 07 17:18:07 crc kubenswrapper[4681]: I1007 17:18:07.769324 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-rlk7m"
Oct 07 17:18:07 crc kubenswrapper[4681]: I1007 17:18:07.800902 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6687d89476-pv9kh"]
Oct 07 17:18:07 crc kubenswrapper[4681]: I1007 17:18:07.854849 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76x9n\" (UniqueName: \"kubernetes.io/projected/9e7a0d41-92ad-4dd7-b836-04c049817f6f-kube-api-access-76x9n\") pod \"openstack-operator-controller-operator-6687d89476-pv9kh\" (UID: \"9e7a0d41-92ad-4dd7-b836-04c049817f6f\") " pod="openstack-operators/openstack-operator-controller-operator-6687d89476-pv9kh"
Oct 07 17:18:07 crc kubenswrapper[4681]: I1007 17:18:07.955922 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76x9n\" (UniqueName: \"kubernetes.io/projected/9e7a0d41-92ad-4dd7-b836-04c049817f6f-kube-api-access-76x9n\") pod \"openstack-operator-controller-operator-6687d89476-pv9kh\" (UID: \"9e7a0d41-92ad-4dd7-b836-04c049817f6f\") " pod="openstack-operators/openstack-operator-controller-operator-6687d89476-pv9kh"
Oct 07 17:18:07 crc kubenswrapper[4681]: I1007 17:18:07.974461 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76x9n\" (UniqueName: \"kubernetes.io/projected/9e7a0d41-92ad-4dd7-b836-04c049817f6f-kube-api-access-76x9n\") pod \"openstack-operator-controller-operator-6687d89476-pv9kh\" (UID: \"9e7a0d41-92ad-4dd7-b836-04c049817f6f\") " pod="openstack-operators/openstack-operator-controller-operator-6687d89476-pv9kh"
Oct 07 17:18:08 crc kubenswrapper[4681]: I1007 17:18:08.083234 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6687d89476-pv9kh"
Oct 07 17:18:08 crc kubenswrapper[4681]: I1007 17:18:08.329634 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6687d89476-pv9kh"]
Oct 07 17:18:08 crc kubenswrapper[4681]: I1007 17:18:08.584085 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6687d89476-pv9kh" event={"ID":"9e7a0d41-92ad-4dd7-b836-04c049817f6f","Type":"ContainerStarted","Data":"f89542842b2ee390b1ba0a28fd002ccc27662dc3f01a315c396179969a800d26"}
Oct 07 17:18:12 crc kubenswrapper[4681]: I1007 17:18:12.195756 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 17:18:12 crc kubenswrapper[4681]: I1007 17:18:12.196100 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 17:18:12 crc kubenswrapper[4681]: I1007 17:18:12.608501 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6687d89476-pv9kh" event={"ID":"9e7a0d41-92ad-4dd7-b836-04c049817f6f","Type":"ContainerStarted","Data":"3beda6031c1ab30cd4176b8198ffd86f4de10cf4cd34ecc667cc04de5e93ebd2"}
Oct 07 17:18:15 crc kubenswrapper[4681]: I1007 17:18:15.631745 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6687d89476-pv9kh" event={"ID":"9e7a0d41-92ad-4dd7-b836-04c049817f6f","Type":"ContainerStarted","Data":"99119e05294b283c11faf00fb7c75c24375b24f8274f10fe76103bb5b0454b4b"}
Oct 07 17:18:15 crc kubenswrapper[4681]: I1007 17:18:15.632058 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-6687d89476-pv9kh"
Oct 07 17:18:15 crc kubenswrapper[4681]: I1007 17:18:15.667105 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-6687d89476-pv9kh" podStartSLOduration=1.891245449 podStartE2EDuration="8.667088172s" podCreationTimestamp="2025-10-07 17:18:07 +0000 UTC" firstStartedPulling="2025-10-07 17:18:08.343464272 +0000 UTC m=+891.990875837" lastFinishedPulling="2025-10-07 17:18:15.119307005 +0000 UTC m=+898.766718560" observedRunningTime="2025-10-07 17:18:15.663849448 +0000 UTC m=+899.311261003" watchObservedRunningTime="2025-10-07 17:18:15.667088172 +0000 UTC m=+899.314499727"
Oct 07 17:18:18 crc kubenswrapper[4681]: I1007 17:18:18.086054 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-6687d89476-pv9kh"
Oct 07 17:18:42 crc kubenswrapper[4681]: I1007 17:18:42.195371 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 17:18:42 crc kubenswrapper[4681]: I1007 17:18:42.196008 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 17:18:42 crc kubenswrapper[4681]: I1007 17:18:42.196049 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6"
Oct 07 17:18:42 crc kubenswrapper[4681]: I1007 17:18:42.196556 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f140a217647ffa9460543862999c01a07af9aa4d5b74d190946e7b3d091b13cf"} pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 07 17:18:42 crc kubenswrapper[4681]: I1007 17:18:42.196605 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" containerID="cri-o://f140a217647ffa9460543862999c01a07af9aa4d5b74d190946e7b3d091b13cf" gracePeriod=600
Oct 07 17:18:42 crc kubenswrapper[4681]: I1007 17:18:42.781383 4681 generic.go:334] "Generic (PLEG): container finished" podID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerID="f140a217647ffa9460543862999c01a07af9aa4d5b74d190946e7b3d091b13cf" exitCode=0
Oct 07 17:18:42 crc kubenswrapper[4681]: I1007 17:18:42.781689 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerDied","Data":"f140a217647ffa9460543862999c01a07af9aa4d5b74d190946e7b3d091b13cf"}
Oct 07 17:18:42 crc kubenswrapper[4681]: I1007 17:18:42.781714 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerStarted","Data":"b8b100182dd665e9c6705ef1fa26e28e1874f69676a8a7de938754edc7de052a"}
Oct 07 17:18:42 crc kubenswrapper[4681]: I1007 17:18:42.781730 4681 scope.go:117] "RemoveContainer" containerID="c9e051db851240f0bede3ae5fc25fdd2610a6d8f3198a352363e8b66b292625b"
Oct 07 17:18:52 crc kubenswrapper[4681]: I1007 17:18:52.963263 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-dhcz7"]
Oct 07 17:18:52 crc kubenswrapper[4681]: I1007 17:18:52.965131 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-dhcz7"
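
The machine-config-daemon liveness failures above land at 17:17:42, 17:18:12, and 17:18:42, exactly 30 seconds apart, and only the third flips the probe to status="unhealthy" and triggers the gracePeriod=600 restart. That cadence is consistent with a probe configured with periodSeconds=30 and failureThreshold=3, though the probe spec itself is not in the log. A toy model of that threshold rule (illustrative only; the real logic lives in kubelet's prober worker):

package main

import "fmt"

// Models consecutive-failure counting against a liveness failureThreshold,
// using the values the timing above suggests (period 30s, threshold 3).
func main() {
	const failureThreshold = 3
	// The three probe results seen in this journal: all connection-refused.
	results := []bool{false, false, false} // 17:17:42, 17:18:12, 17:18:42
	consecutive := 0
	for i, ok := range results {
		if ok {
			consecutive = 0 // any success resets the counter
			continue
		}
		consecutive++
		if consecutive >= failureThreshold {
			fmt.Printf("failure #%d: container marked unhealthy, restart with grace period\n", i+1)
		}
	}
}
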
Oct 07 17:18:52 crc kubenswrapper[4681]: I1007 17:18:52.972644 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-d6hkg"
Oct 07 17:18:52 crc kubenswrapper[4681]: I1007 17:18:52.980051 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-dhcz7"]
Oct 07 17:18:52 crc kubenswrapper[4681]: I1007 17:18:52.984373 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-c8nzf"]
Oct 07 17:18:52 crc kubenswrapper[4681]: I1007 17:18:52.985315 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-c8nzf"
Oct 07 17:18:52 crc kubenswrapper[4681]: I1007 17:18:52.987998 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-tccnt"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.010223 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-9qrr7"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.011366 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9qrr7"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.012890 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-s47ws"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.020329 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-c8nzf"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.040475 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-9qrr7"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.049460 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dffll\" (UniqueName: \"kubernetes.io/projected/fe9f244f-7a1b-43f2-b1d2-08dcf0454fc3-kube-api-access-dffll\") pod \"barbican-operator-controller-manager-58c4cd55f4-dhcz7\" (UID: \"fe9f244f-7a1b-43f2-b1d2-08dcf0454fc3\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-dhcz7"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.049513 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdthl\" (UniqueName: \"kubernetes.io/projected/72f6dfae-3a77-46ad-874b-c94d9059566c-kube-api-access-gdthl\") pod \"designate-operator-controller-manager-75dfd9b554-9qrr7\" (UID: \"72f6dfae-3a77-46ad-874b-c94d9059566c\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9qrr7"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.049583 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbvzs\" (UniqueName: \"kubernetes.io/projected/c7125c26-53ab-471e-bf33-05265e3f571a-kube-api-access-zbvzs\") pod \"cinder-operator-controller-manager-7d4d4f8d-c8nzf\" (UID: \"c7125c26-53ab-471e-bf33-05265e3f571a\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-c8nzf"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.054922 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-rhvt8"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.055840 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-rhvt8"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.058540 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-qb72f"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.066400 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-98wq4"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.067390 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-98wq4"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.068855 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-s9cp7"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.079460 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-8qj4n"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.080380 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-8qj4n"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.083379 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-hdb29"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.091549 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-98wq4"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.104580 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-rhvt8"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.122406 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-9fwcg"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.123389 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-9fwcg"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.127327 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.127338 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-7vmrs"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.133077 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-8qj4n"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.142815 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-9fwcg"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.150445 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l84bm\" (UniqueName: \"kubernetes.io/projected/c3da478d-c5f4-473c-9848-740845c9adf1-kube-api-access-l84bm\") pod \"heat-operator-controller-manager-54b4974c45-98wq4\" (UID: \"c3da478d-c5f4-473c-9848-740845c9adf1\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-98wq4"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.150496 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbvzs\" (UniqueName: \"kubernetes.io/projected/c7125c26-53ab-471e-bf33-05265e3f571a-kube-api-access-zbvzs\") pod \"cinder-operator-controller-manager-7d4d4f8d-c8nzf\" (UID: \"c7125c26-53ab-471e-bf33-05265e3f571a\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-c8nzf"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.150530 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dffll\" (UniqueName: \"kubernetes.io/projected/fe9f244f-7a1b-43f2-b1d2-08dcf0454fc3-kube-api-access-dffll\") pod \"barbican-operator-controller-manager-58c4cd55f4-dhcz7\" (UID: \"fe9f244f-7a1b-43f2-b1d2-08dcf0454fc3\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-dhcz7"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.150548 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4-cert\") pod \"infra-operator-controller-manager-658588b8c9-9fwcg\" (UID: \"f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-9fwcg"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.150571 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tml7c\" (UniqueName: \"kubernetes.io/projected/049764d0-d62e-4553-9628-3d1b7258d126-kube-api-access-tml7c\") pod \"glance-operator-controller-manager-5dc44df7d5-rhvt8\" (UID: \"049764d0-d62e-4553-9628-3d1b7258d126\") " pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-rhvt8"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.150595 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdthl\" (UniqueName: \"kubernetes.io/projected/72f6dfae-3a77-46ad-874b-c94d9059566c-kube-api-access-gdthl\") pod \"designate-operator-controller-manager-75dfd9b554-9qrr7\" (UID: \"72f6dfae-3a77-46ad-874b-c94d9059566c\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9qrr7"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.150633 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9x78\" (UniqueName: \"kubernetes.io/projected/f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4-kube-api-access-r9x78\") pod \"infra-operator-controller-manager-658588b8c9-9fwcg\" (UID: \"f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-9fwcg"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.150659 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2jj7\" (UniqueName: \"kubernetes.io/projected/8e8c5ada-0313-4a16-b9cd-17d39ce932ca-kube-api-access-h2jj7\") pod \"horizon-operator-controller-manager-76d5b87f47-8qj4n\" (UID: \"8e8c5ada-0313-4a16-b9cd-17d39ce932ca\") " pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-8qj4n"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.153704 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8m6q6"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.154908 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8m6q6"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.162792 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-s7smc"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.162934 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-mt5xr"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.163912 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-649675d675-mt5xr"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.176245 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-9wdtn"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.185439 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbvzs\" (UniqueName: \"kubernetes.io/projected/c7125c26-53ab-471e-bf33-05265e3f571a-kube-api-access-zbvzs\") pod \"cinder-operator-controller-manager-7d4d4f8d-c8nzf\" (UID: \"c7125c26-53ab-471e-bf33-05265e3f571a\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-c8nzf"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.196975 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-mt5xr"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.200461 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdthl\" (UniqueName: \"kubernetes.io/projected/72f6dfae-3a77-46ad-874b-c94d9059566c-kube-api-access-gdthl\") pod \"designate-operator-controller-manager-75dfd9b554-9qrr7\" (UID: \"72f6dfae-3a77-46ad-874b-c94d9059566c\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9qrr7"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.207285 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8m6q6"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.210412 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-spqlq"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.210710 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dffll\" (UniqueName: \"kubernetes.io/projected/fe9f244f-7a1b-43f2-b1d2-08dcf0454fc3-kube-api-access-dffll\") pod \"barbican-operator-controller-manager-58c4cd55f4-dhcz7\" (UID: \"fe9f244f-7a1b-43f2-b1d2-08dcf0454fc3\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-dhcz7"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.211399 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-spqlq"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.219337 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-7d4sn"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.221280 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-spqlq"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.230384 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-ddgm8"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.231640 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-ddgm8"
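
Between 17:18:52.963263 and 17:18:53.230384 the sync loop accepted ADDs for eleven operator controller-managers, each followed by the same "No sandbox"/reflector/UPDATE/volume-mount pattern. A small sketch (an offline helper written for this log's line format, not a kubelet facility) that counts SyncLoop ADD events per second to make such scheduling bursts visible:

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Buckets "SyncLoop ADD" entries by their one-second journal timestamp.
var addRe = regexp.MustCompile(`^([A-Z][a-z]{2} \d{2} \d{2}:\d{2}:\d{2}).*"SyncLoop ADD"`)

func main() {
	counts := map[string]int{}
	var order []string // preserve first-seen ordering of the buckets
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
	for sc.Scan() {
		if m := addRe.FindStringSubmatch(sc.Text()); m != nil {
			if counts[m[1]] == 0 {
				order = append(order, m[1])
			}
			counts[m[1]]++
		}
	}
	for _, ts := range order {
		fmt.Printf("%s  %d pod(s) added\n", ts, counts[ts])
	}
}

On the portion of the journal shown here it would report 2 pods added at 17:18:52 and 9 more at 17:18:53.
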
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-ddgm8" Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.243007 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-df5w9" Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.251547 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4-cert\") pod \"infra-operator-controller-manager-658588b8c9-9fwcg\" (UID: \"f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-9fwcg" Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.251605 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tml7c\" (UniqueName: \"kubernetes.io/projected/049764d0-d62e-4553-9628-3d1b7258d126-kube-api-access-tml7c\") pod \"glance-operator-controller-manager-5dc44df7d5-rhvt8\" (UID: \"049764d0-d62e-4553-9628-3d1b7258d126\") " pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-rhvt8" Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.251631 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4vws\" (UniqueName: \"kubernetes.io/projected/720e687c-21aa-4f31-bc4f-7be0f836ec16-kube-api-access-c4vws\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-ddgm8\" (UID: \"720e687c-21aa-4f31-bc4f-7be0f836ec16\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-ddgm8" Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.251651 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9pk8\" (UniqueName: \"kubernetes.io/projected/bd602c09-19c5-45a7-b8fa-4202e147bbf9-kube-api-access-s9pk8\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-8m6q6\" (UID: \"bd602c09-19c5-45a7-b8fa-4202e147bbf9\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8m6q6" Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.251693 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9x78\" (UniqueName: \"kubernetes.io/projected/f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4-kube-api-access-r9x78\") pod \"infra-operator-controller-manager-658588b8c9-9fwcg\" (UID: \"f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-9fwcg" Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.251716 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s54xb\" (UniqueName: \"kubernetes.io/projected/049a3d2e-6274-44c0-8b56-d19e8d8b1cfc-kube-api-access-s54xb\") pod \"ironic-operator-controller-manager-649675d675-mt5xr\" (UID: \"049a3d2e-6274-44c0-8b56-d19e8d8b1cfc\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-mt5xr" Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.251735 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2jj7\" (UniqueName: \"kubernetes.io/projected/8e8c5ada-0313-4a16-b9cd-17d39ce932ca-kube-api-access-h2jj7\") pod \"horizon-operator-controller-manager-76d5b87f47-8qj4n\" (UID: \"8e8c5ada-0313-4a16-b9cd-17d39ce932ca\") " pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-8qj4n" Oct 07 
17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.251762 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd9f6\" (UniqueName: \"kubernetes.io/projected/5cc0eff1-427a-4489-8957-f5148e6a0630-kube-api-access-wd9f6\") pod \"manila-operator-controller-manager-65d89cfd9f-spqlq\" (UID: \"5cc0eff1-427a-4489-8957-f5148e6a0630\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-spqlq" Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.251782 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l84bm\" (UniqueName: \"kubernetes.io/projected/c3da478d-c5f4-473c-9848-740845c9adf1-kube-api-access-l84bm\") pod \"heat-operator-controller-manager-54b4974c45-98wq4\" (UID: \"c3da478d-c5f4-473c-9848-740845c9adf1\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-98wq4" Oct 07 17:18:53 crc kubenswrapper[4681]: E1007 17:18:53.252150 4681 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 07 17:18:53 crc kubenswrapper[4681]: E1007 17:18:53.252199 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4-cert podName:f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4 nodeName:}" failed. No retries permitted until 2025-10-07 17:18:53.752184225 +0000 UTC m=+937.399595780 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4-cert") pod "infra-operator-controller-manager-658588b8c9-9fwcg" (UID: "f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4") : secret "infra-operator-webhook-server-cert" not found Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.261219 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-plkk5"] Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.262116 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-plkk5" Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.274709 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-mljpn" Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.281536 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l84bm\" (UniqueName: \"kubernetes.io/projected/c3da478d-c5f4-473c-9848-740845c9adf1-kube-api-access-l84bm\") pod \"heat-operator-controller-manager-54b4974c45-98wq4\" (UID: \"c3da478d-c5f4-473c-9848-740845c9adf1\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-98wq4" Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.281973 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2jj7\" (UniqueName: \"kubernetes.io/projected/8e8c5ada-0313-4a16-b9cd-17d39ce932ca-kube-api-access-h2jj7\") pod \"horizon-operator-controller-manager-76d5b87f47-8qj4n\" (UID: \"8e8c5ada-0313-4a16-b9cd-17d39ce932ca\") " pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-8qj4n" Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.284175 4681 util.go:30] "No sandbox for pod can be found. 
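Note: the two E1007 entries above show the kubelet unable to materialize the "cert" secret volume for infra-operator-controller-manager because the secret "infra-operator-webhook-server-cert" does not exist yet; the pod cannot start until a later retry finds it (the successful mount at 17:18:54.894584 further down confirms the secret appeared shortly after). A minimal client-go sketch for watching for the missing secret from outside the node, assuming a standard kubeconfig; the polling loop and messages are illustrative, not kubelet code:

```go
package main

import (
	"context"
	"fmt"
	"time"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Namespace and secret name are taken from the log lines above.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	for {
		_, err := cs.CoreV1().Secrets("openstack-operators").
			Get(context.TODO(), "infra-operator-webhook-server-cert", metav1.GetOptions{})
		if err == nil {
			fmt.Println("secret present; the kubelet's next mount retry should succeed")
			return
		}
		fmt.Println("still missing:", err) // mirrors the kubelet's secret.go:188 error
		time.Sleep(2 * time.Second)
	}
}
```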
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.292638 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9x78\" (UniqueName: \"kubernetes.io/projected/f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4-kube-api-access-r9x78\") pod \"infra-operator-controller-manager-658588b8c9-9fwcg\" (UID: \"f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-9fwcg"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.298009 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tml7c\" (UniqueName: \"kubernetes.io/projected/049764d0-d62e-4553-9628-3d1b7258d126-kube-api-access-tml7c\") pod \"glance-operator-controller-manager-5dc44df7d5-rhvt8\" (UID: \"049764d0-d62e-4553-9628-3d1b7258d126\") " pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-rhvt8"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.316083 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-ddgm8"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.321378 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-c8nzf"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.321985 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-plkk5"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.337186 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9qrr7"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.382942 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-vh9d7"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.383791 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-98wq4"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.384327 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-rhvt8"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.384685 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vh9d7"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.389933 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9pk8\" (UniqueName: \"kubernetes.io/projected/bd602c09-19c5-45a7-b8fa-4202e147bbf9-kube-api-access-s9pk8\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-8m6q6\" (UID: \"bd602c09-19c5-45a7-b8fa-4202e147bbf9\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8m6q6"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.390128 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s54xb\" (UniqueName: \"kubernetes.io/projected/049a3d2e-6274-44c0-8b56-d19e8d8b1cfc-kube-api-access-s54xb\") pod \"ironic-operator-controller-manager-649675d675-mt5xr\" (UID: \"049a3d2e-6274-44c0-8b56-d19e8d8b1cfc\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-mt5xr"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.390260 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd9f6\" (UniqueName: \"kubernetes.io/projected/5cc0eff1-427a-4489-8957-f5148e6a0630-kube-api-access-wd9f6\") pod \"manila-operator-controller-manager-65d89cfd9f-spqlq\" (UID: \"5cc0eff1-427a-4489-8957-f5148e6a0630\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-spqlq"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.390417 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4vws\" (UniqueName: \"kubernetes.io/projected/720e687c-21aa-4f31-bc4f-7be0f836ec16-kube-api-access-c4vws\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-ddgm8\" (UID: \"720e687c-21aa-4f31-bc4f-7be0f836ec16\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-ddgm8"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.390542 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtz9l\" (UniqueName: \"kubernetes.io/projected/52146033-65f3-42f4-b0b8-2b550445305f-kube-api-access-mtz9l\") pod \"neutron-operator-controller-manager-8d984cc4d-plkk5\" (UID: \"52146033-65f3-42f4-b0b8-2b550445305f\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-plkk5"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.407444 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-8qj4n"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.416685 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-2x8vn"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.435704 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd9f6\" (UniqueName: \"kubernetes.io/projected/5cc0eff1-427a-4489-8957-f5148e6a0630-kube-api-access-wd9f6\") pod \"manila-operator-controller-manager-65d89cfd9f-spqlq\" (UID: \"5cc0eff1-427a-4489-8957-f5148e6a0630\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-spqlq"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.438933 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9pk8\" (UniqueName: \"kubernetes.io/projected/bd602c09-19c5-45a7-b8fa-4202e147bbf9-kube-api-access-s9pk8\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-8m6q6\" (UID: \"bd602c09-19c5-45a7-b8fa-4202e147bbf9\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8m6q6"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.440547 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4vws\" (UniqueName: \"kubernetes.io/projected/720e687c-21aa-4f31-bc4f-7be0f836ec16-kube-api-access-c4vws\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-ddgm8\" (UID: \"720e687c-21aa-4f31-bc4f-7be0f836ec16\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-ddgm8"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.459908 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-t75k9"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.461026 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-t75k9"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.465787 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-72jvj"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.466795 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s54xb\" (UniqueName: \"kubernetes.io/projected/049a3d2e-6274-44c0-8b56-d19e8d8b1cfc-kube-api-access-s54xb\") pod \"ironic-operator-controller-manager-649675d675-mt5xr\" (UID: \"049a3d2e-6274-44c0-8b56-d19e8d8b1cfc\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-mt5xr"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.490247 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-t75k9"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.491814 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf56d\" (UniqueName: \"kubernetes.io/projected/c318e2b6-9014-471c-b54d-de14e50a1dfe-kube-api-access-wf56d\") pod \"octavia-operator-controller-manager-7468f855d8-t75k9\" (UID: \"c318e2b6-9014-471c-b54d-de14e50a1dfe\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-t75k9"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.491991 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtz9l\" (UniqueName: \"kubernetes.io/projected/52146033-65f3-42f4-b0b8-2b550445305f-kube-api-access-mtz9l\") pod \"neutron-operator-controller-manager-8d984cc4d-plkk5\" (UID: \"52146033-65f3-42f4-b0b8-2b550445305f\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-plkk5"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.492019 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twlfl\" (UniqueName: \"kubernetes.io/projected/9c9bc247-6ea6-486c-956c-292930b2c111-kube-api-access-twlfl\") pod \"nova-operator-controller-manager-7c7fc454ff-vh9d7\" (UID: \"9c9bc247-6ea6-486c-956c-292930b2c111\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vh9d7"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.509867 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-v7gh2"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.522828 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-v7gh2"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.523719 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665clkrns"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.524784 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665clkrns"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.531034 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-d69t6"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.531637 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-tg4rj"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.531778 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.538086 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-vh9d7"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.544931 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtz9l\" (UniqueName: \"kubernetes.io/projected/52146033-65f3-42f4-b0b8-2b550445305f-kube-api-access-mtz9l\") pod \"neutron-operator-controller-manager-8d984cc4d-plkk5\" (UID: \"52146033-65f3-42f4-b0b8-2b550445305f\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-plkk5"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.553269 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8m6q6"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.564999 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-v7gh2"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.587061 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-688l4"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.588212 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-649675d675-mt5xr"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.588424 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-688l4"
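Note: each "No sandbox for pod can be found. Need to start a new one" entry marks the first sync of a pod that has no existing sandbox, so the kubelet creates one; here roughly two dozen operator pods hit this within a couple of seconds. A rough sketch for tallying them from the raw journal (the regexp assumes klog's pod="..." key formatting as seen above):

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Usage: journalctl -u kubelet | go run tally.go
func main() {
	re := regexp.MustCompile(`No sandbox for pod can be found.*?pod="([^"]+)"`)
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // kubelet lines can be very long
	for sc.Scan() {
		for _, m := range re.FindAllStringSubmatch(sc.Text(), -1) {
			counts[m[1]]++
		}
	}
	for pod, n := range counts {
		fmt.Printf("%3d  %s\n", n, pod)
	}
}
```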
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.591643 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-g88g8"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.595764 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twlfl\" (UniqueName: \"kubernetes.io/projected/9c9bc247-6ea6-486c-956c-292930b2c111-kube-api-access-twlfl\") pod \"nova-operator-controller-manager-7c7fc454ff-vh9d7\" (UID: \"9c9bc247-6ea6-486c-956c-292930b2c111\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vh9d7"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.595840 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7636f86-a942-4f89-bc80-01a3ce70c13e-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665clkrns\" (UID: \"a7636f86-a942-4f89-bc80-01a3ce70c13e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665clkrns"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.595869 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr2p2\" (UniqueName: \"kubernetes.io/projected/a7636f86-a942-4f89-bc80-01a3ce70c13e-kube-api-access-pr2p2\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665clkrns\" (UID: \"a7636f86-a942-4f89-bc80-01a3ce70c13e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665clkrns"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.595928 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf56d\" (UniqueName: \"kubernetes.io/projected/c318e2b6-9014-471c-b54d-de14e50a1dfe-kube-api-access-wf56d\") pod \"octavia-operator-controller-manager-7468f855d8-t75k9\" (UID: \"c318e2b6-9014-471c-b54d-de14e50a1dfe\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-t75k9"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.595972 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tszpv\" (UniqueName: \"kubernetes.io/projected/10d09ccc-8bc7-4bf8-8bb4-b5bd1b234b28-kube-api-access-tszpv\") pod \"ovn-operator-controller-manager-6d8b6f9b9-v7gh2\" (UID: \"10d09ccc-8bc7-4bf8-8bb4-b5bd1b234b28\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-v7gh2"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.613641 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665clkrns"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.656228 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf56d\" (UniqueName: \"kubernetes.io/projected/c318e2b6-9014-471c-b54d-de14e50a1dfe-kube-api-access-wf56d\") pod \"octavia-operator-controller-manager-7468f855d8-t75k9\" (UID: \"c318e2b6-9014-471c-b54d-de14e50a1dfe\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-t75k9"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.657290 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-688l4"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.661275 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-bxvlr"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.661940 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twlfl\" (UniqueName: \"kubernetes.io/projected/9c9bc247-6ea6-486c-956c-292930b2c111-kube-api-access-twlfl\") pod \"nova-operator-controller-manager-7c7fc454ff-vh9d7\" (UID: \"9c9bc247-6ea6-486c-956c-292930b2c111\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vh9d7"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.665993 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-bxvlr"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.670818 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-545jm"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.680345 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-zhppp"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.683023 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-zhppp"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.686178 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-mqrv4"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.690121 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-svd7g"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.691305 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-svd7g"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.697390 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tszpv\" (UniqueName: \"kubernetes.io/projected/10d09ccc-8bc7-4bf8-8bb4-b5bd1b234b28-kube-api-access-tszpv\") pod \"ovn-operator-controller-manager-6d8b6f9b9-v7gh2\" (UID: \"10d09ccc-8bc7-4bf8-8bb4-b5bd1b234b28\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-v7gh2"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.697614 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7636f86-a942-4f89-bc80-01a3ce70c13e-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665clkrns\" (UID: \"a7636f86-a942-4f89-bc80-01a3ce70c13e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665clkrns"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.697667 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z7t9\" (UniqueName: \"kubernetes.io/projected/df2c4c3b-cd2b-487e-bef4-071d2c9f0eb4-kube-api-access-9z7t9\") pod \"swift-operator-controller-manager-6859f9b676-bxvlr\" (UID: \"df2c4c3b-cd2b-487e-bef4-071d2c9f0eb4\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-bxvlr"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.697708 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr2p2\" (UniqueName: \"kubernetes.io/projected/a7636f86-a942-4f89-bc80-01a3ce70c13e-kube-api-access-pr2p2\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665clkrns\" (UID: \"a7636f86-a942-4f89-bc80-01a3ce70c13e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665clkrns"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.697778 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc44p\" (UniqueName: \"kubernetes.io/projected/97a43b61-b120-4613-9b2a-603e1d90878a-kube-api-access-tc44p\") pod \"placement-operator-controller-manager-54689d9f88-688l4\" (UID: \"97a43b61-b120-4613-9b2a-603e1d90878a\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-688l4"
Oct 07 17:18:53 crc kubenswrapper[4681]: E1007 17:18:53.698337 4681 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Oct 07 17:18:53 crc kubenswrapper[4681]: E1007 17:18:53.704673 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7636f86-a942-4f89-bc80-01a3ce70c13e-cert podName:a7636f86-a942-4f89-bc80-01a3ce70c13e nodeName:}" failed. No retries permitted until 2025-10-07 17:18:54.198363215 +0000 UTC m=+937.845774770 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a7636f86-a942-4f89-bc80-01a3ce70c13e-cert") pod "openstack-baremetal-operator-controller-manager-5dfbbd665clkrns" (UID: "a7636f86-a942-4f89-bc80-01a3ce70c13e") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.709408 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-zhppp"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.713686 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-54frv"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.714143 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-bxvlr"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.726895 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tszpv\" (UniqueName: \"kubernetes.io/projected/10d09ccc-8bc7-4bf8-8bb4-b5bd1b234b28-kube-api-access-tszpv\") pod \"ovn-operator-controller-manager-6d8b6f9b9-v7gh2\" (UID: \"10d09ccc-8bc7-4bf8-8bb4-b5bd1b234b28\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-v7gh2"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.727166 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-spqlq"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.734819 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-ddgm8"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.735314 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-svd7g"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.758093 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr2p2\" (UniqueName: \"kubernetes.io/projected/a7636f86-a942-4f89-bc80-01a3ce70c13e-kube-api-access-pr2p2\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665clkrns\" (UID: \"a7636f86-a942-4f89-bc80-01a3ce70c13e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665clkrns"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.777778 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-plkk5"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.799530 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc44p\" (UniqueName: \"kubernetes.io/projected/97a43b61-b120-4613-9b2a-603e1d90878a-kube-api-access-tc44p\") pod \"placement-operator-controller-manager-54689d9f88-688l4\" (UID: \"97a43b61-b120-4613-9b2a-603e1d90878a\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-688l4"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.799584 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lchc5\" (UniqueName: \"kubernetes.io/projected/be78a905-7f1e-4ea1-baf4-5f84246df65f-kube-api-access-lchc5\") pod \"test-operator-controller-manager-5cd5cb47d7-svd7g\" (UID: \"be78a905-7f1e-4ea1-baf4-5f84246df65f\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-svd7g"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.799621 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4-cert\") pod \"infra-operator-controller-manager-658588b8c9-9fwcg\" (UID: \"f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-9fwcg"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.801124 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z7t9\" (UniqueName: \"kubernetes.io/projected/df2c4c3b-cd2b-487e-bef4-071d2c9f0eb4-kube-api-access-9z7t9\") pod \"swift-operator-controller-manager-6859f9b676-bxvlr\" (UID: \"df2c4c3b-cd2b-487e-bef4-071d2c9f0eb4\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-bxvlr"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.801156 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kdgs\" (UniqueName: \"kubernetes.io/projected/1a3899b1-53e5-413b-b1c1-c7d2f2274b75-kube-api-access-5kdgs\") pod \"telemetry-operator-controller-manager-5d4d74dd89-zhppp\" (UID: \"1a3899b1-53e5-413b-b1c1-c7d2f2274b75\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-zhppp"
Oct 07 17:18:53 crc kubenswrapper[4681]: E1007 17:18:53.801492 4681 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Oct 07 17:18:53 crc kubenswrapper[4681]: E1007 17:18:53.801529 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4-cert podName:f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4 nodeName:}" failed. No retries permitted until 2025-10-07 17:18:54.801514768 +0000 UTC m=+938.448926323 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4-cert") pod "infra-operator-controller-manager-658588b8c9-9fwcg" (UID: "f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4") : secret "infra-operator-webhook-server-cert" not found
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.822383 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vh9d7"
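Note: compare the retry stanzas for the same infra-operator "cert" volume: durationBeforeRetry is 500ms at 17:18:53.252199 and 1s at 17:18:53.801529, and the baremetal cert volume follows the same 500ms-then-1s progression below. The kubelet's pending-operations table backs off per volume, doubling the delay on each failure. A toy model of the visible doubling; the cap value is an assumption for illustration, not taken from the kubelet source:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 500 * time.Millisecond // first durationBeforeRetry seen in the log
	maxDelay := 2 * time.Minute     // assumed cap, for illustration only
	for attempt := 1; attempt <= 8; attempt++ {
		fmt.Printf("failure %d -> durationBeforeRetry %v\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```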
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.830915 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gn84j"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.832050 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gn84j"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.833322 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-t75k9"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.846682 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-6mwgv"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.848960 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gn84j"]
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.855802 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-v7gh2"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.876786 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z7t9\" (UniqueName: \"kubernetes.io/projected/df2c4c3b-cd2b-487e-bef4-071d2c9f0eb4-kube-api-access-9z7t9\") pod \"swift-operator-controller-manager-6859f9b676-bxvlr\" (UID: \"df2c4c3b-cd2b-487e-bef4-071d2c9f0eb4\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-bxvlr"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.877372 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc44p\" (UniqueName: \"kubernetes.io/projected/97a43b61-b120-4613-9b2a-603e1d90878a-kube-api-access-tc44p\") pod \"placement-operator-controller-manager-54689d9f88-688l4\" (UID: \"97a43b61-b120-4613-9b2a-603e1d90878a\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-688l4"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.905965 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kdgs\" (UniqueName: \"kubernetes.io/projected/1a3899b1-53e5-413b-b1c1-c7d2f2274b75-kube-api-access-5kdgs\") pod \"telemetry-operator-controller-manager-5d4d74dd89-zhppp\" (UID: \"1a3899b1-53e5-413b-b1c1-c7d2f2274b75\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-zhppp"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.906027 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lchc5\" (UniqueName: \"kubernetes.io/projected/be78a905-7f1e-4ea1-baf4-5f84246df65f-kube-api-access-lchc5\") pod \"test-operator-controller-manager-5cd5cb47d7-svd7g\" (UID: \"be78a905-7f1e-4ea1-baf4-5f84246df65f\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-svd7g"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.906074 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mprf\" (UniqueName: \"kubernetes.io/projected/6e4f29f4-5ec2-4476-9153-954cc984443f-kube-api-access-8mprf\") pod \"watcher-operator-controller-manager-6cbc6dd547-gn84j\" (UID: \"6e4f29f4-5ec2-4476-9153-954cc984443f\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gn84j"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.941453 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kdgs\" (UniqueName: \"kubernetes.io/projected/1a3899b1-53e5-413b-b1c1-c7d2f2274b75-kube-api-access-5kdgs\") pod \"telemetry-operator-controller-manager-5d4d74dd89-zhppp\" (UID: \"1a3899b1-53e5-413b-b1c1-c7d2f2274b75\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-zhppp"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.944849 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lchc5\" (UniqueName: \"kubernetes.io/projected/be78a905-7f1e-4ea1-baf4-5f84246df65f-kube-api-access-lchc5\") pod \"test-operator-controller-manager-5cd5cb47d7-svd7g\" (UID: \"be78a905-7f1e-4ea1-baf4-5f84246df65f\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-svd7g"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.965357 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-688l4"
Oct 07 17:18:53 crc kubenswrapper[4681]: I1007 17:18:53.985050 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-77dffbdc98-vqctw"]
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.004801 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-77dffbdc98-vqctw"]
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.004838 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-sq4dv"]
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.018941 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-vqctw"
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.021902 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mprf\" (UniqueName: \"kubernetes.io/projected/6e4f29f4-5ec2-4476-9153-954cc984443f-kube-api-access-8mprf\") pod \"watcher-operator-controller-manager-6cbc6dd547-gn84j\" (UID: \"6e4f29f4-5ec2-4476-9153-954cc984443f\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gn84j"
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.036200 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-rxcs7"
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.036607 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.047214 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-bxvlr"
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.056577 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mprf\" (UniqueName: \"kubernetes.io/projected/6e4f29f4-5ec2-4476-9153-954cc984443f-kube-api-access-8mprf\") pod \"watcher-operator-controller-manager-6cbc6dd547-gn84j\" (UID: \"6e4f29f4-5ec2-4476-9153-954cc984443f\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gn84j"
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.057869 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-sq4dv"]
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.058635 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-sq4dv"
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.065974 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-ds5x4"
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.093870 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-zhppp"
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.126635 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66b31094-5895-41aa-a268-fd2d13990f9f-cert\") pod \"openstack-operator-controller-manager-77dffbdc98-vqctw\" (UID: \"66b31094-5895-41aa-a268-fd2d13990f9f\") " pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-vqctw"
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.126747 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dsst\" (UniqueName: \"kubernetes.io/projected/66b31094-5895-41aa-a268-fd2d13990f9f-kube-api-access-5dsst\") pod \"openstack-operator-controller-manager-77dffbdc98-vqctw\" (UID: \"66b31094-5895-41aa-a268-fd2d13990f9f\") " pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-vqctw"
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.126788 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nwf7\" (UniqueName: \"kubernetes.io/projected/357d30fc-7c29-4bea-a20a-926b5723bcb0-kube-api-access-7nwf7\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-sq4dv\" (UID: \"357d30fc-7c29-4bea-a20a-926b5723bcb0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-sq4dv"
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.174410 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-svd7g"
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.193440 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gn84j"
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.242699 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66b31094-5895-41aa-a268-fd2d13990f9f-cert\") pod \"openstack-operator-controller-manager-77dffbdc98-vqctw\" (UID: \"66b31094-5895-41aa-a268-fd2d13990f9f\") " pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-vqctw"
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.242754 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7636f86-a942-4f89-bc80-01a3ce70c13e-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665clkrns\" (UID: \"a7636f86-a942-4f89-bc80-01a3ce70c13e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665clkrns"
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.242802 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dsst\" (UniqueName: \"kubernetes.io/projected/66b31094-5895-41aa-a268-fd2d13990f9f-kube-api-access-5dsst\") pod \"openstack-operator-controller-manager-77dffbdc98-vqctw\" (UID: \"66b31094-5895-41aa-a268-fd2d13990f9f\") " pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-vqctw"
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.242838 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nwf7\" (UniqueName: \"kubernetes.io/projected/357d30fc-7c29-4bea-a20a-926b5723bcb0-kube-api-access-7nwf7\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-sq4dv\" (UID: \"357d30fc-7c29-4bea-a20a-926b5723bcb0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-sq4dv"
Oct 07 17:18:54 crc kubenswrapper[4681]: E1007 17:18:54.243133 4681 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Oct 07 17:18:54 crc kubenswrapper[4681]: E1007 17:18:54.243219 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7636f86-a942-4f89-bc80-01a3ce70c13e-cert podName:a7636f86-a942-4f89-bc80-01a3ce70c13e nodeName:}" failed. No retries permitted until 2025-10-07 17:18:55.243193091 +0000 UTC m=+938.890604646 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a7636f86-a942-4f89-bc80-01a3ce70c13e-cert") pod "openstack-baremetal-operator-controller-manager-5dfbbd665clkrns" (UID: "a7636f86-a942-4f89-bc80-01a3ce70c13e") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Oct 07 17:18:54 crc kubenswrapper[4681]: E1007 17:18:54.243293 4681 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Oct 07 17:18:54 crc kubenswrapper[4681]: E1007 17:18:54.243390 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66b31094-5895-41aa-a268-fd2d13990f9f-cert podName:66b31094-5895-41aa-a268-fd2d13990f9f nodeName:}" failed. No retries permitted until 2025-10-07 17:18:54.743367426 +0000 UTC m=+938.390778981 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66b31094-5895-41aa-a268-fd2d13990f9f-cert") pod "openstack-operator-controller-manager-77dffbdc98-vqctw" (UID: "66b31094-5895-41aa-a268-fd2d13990f9f") : secret "webhook-server-cert" not found
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.273866 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nwf7\" (UniqueName: \"kubernetes.io/projected/357d30fc-7c29-4bea-a20a-926b5723bcb0-kube-api-access-7nwf7\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-sq4dv\" (UID: \"357d30fc-7c29-4bea-a20a-926b5723bcb0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-sq4dv"
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.274492 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dsst\" (UniqueName: \"kubernetes.io/projected/66b31094-5895-41aa-a268-fd2d13990f9f-kube-api-access-5dsst\") pod \"openstack-operator-controller-manager-77dffbdc98-vqctw\" (UID: \"66b31094-5895-41aa-a268-fd2d13990f9f\") " pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-vqctw"
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.279169 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-c8nzf"]
Oct 07 17:18:54 crc kubenswrapper[4681]: W1007 17:18:54.311488 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7125c26_53ab_471e_bf33_05265e3f571a.slice/crio-e72a5e93c577562e9f51e5676c3c6097a8a23f9946b9c89e49722bdb696be8ba WatchSource:0}: Error finding container e72a5e93c577562e9f51e5676c3c6097a8a23f9946b9c89e49722bdb696be8ba: Status 404 returned error can't find the container with id e72a5e93c577562e9f51e5676c3c6097a8a23f9946b9c89e49722bdb696be8ba
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.399667 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-dhcz7"]
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.410570 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-sq4dv"
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.680153 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-9qrr7"]
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.749136 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-rhvt8"]
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.780540 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66b31094-5895-41aa-a268-fd2d13990f9f-cert\") pod \"openstack-operator-controller-manager-77dffbdc98-vqctw\" (UID: \"66b31094-5895-41aa-a268-fd2d13990f9f\") " pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-vqctw"
Oct 07 17:18:54 crc kubenswrapper[4681]: E1007 17:18:54.780739 4681 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Oct 07 17:18:54 crc kubenswrapper[4681]: E1007 17:18:54.780787 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66b31094-5895-41aa-a268-fd2d13990f9f-cert podName:66b31094-5895-41aa-a268-fd2d13990f9f nodeName:}" failed. No retries permitted until 2025-10-07 17:18:55.780770886 +0000 UTC m=+939.428182441 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66b31094-5895-41aa-a268-fd2d13990f9f-cert") pod "openstack-operator-controller-manager-77dffbdc98-vqctw" (UID: "66b31094-5895-41aa-a268-fd2d13990f9f") : secret "webhook-server-cert" not found
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.882184 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4-cert\") pod \"infra-operator-controller-manager-658588b8c9-9fwcg\" (UID: \"f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-9fwcg"
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.894584 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4-cert\") pod \"infra-operator-controller-manager-658588b8c9-9fwcg\" (UID: \"f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-9fwcg"
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.924202 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9qrr7" event={"ID":"72f6dfae-3a77-46ad-874b-c94d9059566c","Type":"ContainerStarted","Data":"9af75f03f79866fb6b7ea123feeec58f472125295905f4288da1af6fb90643bb"}
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.925457 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-rhvt8" event={"ID":"049764d0-d62e-4553-9628-3d1b7258d126","Type":"ContainerStarted","Data":"7690a0a4b6b3e5703590fcd4ab3b9a61848291991cdea4f5300b1fb6fea5f24e"}
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.926621 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-dhcz7" event={"ID":"fe9f244f-7a1b-43f2-b1d2-08dcf0454fc3","Type":"ContainerStarted","Data":"569519aebf788bf21fd6237424e65543c78a832ea951d27ed0a24769a1e8fb99"}
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.927987 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-c8nzf" event={"ID":"c7125c26-53ab-471e-bf33-05265e3f571a","Type":"ContainerStarted","Data":"e72a5e93c577562e9f51e5676c3c6097a8a23f9946b9c89e49722bdb696be8ba"}
Oct 07 17:18:54 crc kubenswrapper[4681]: I1007 17:18:54.967032 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-9fwcg"
Oct 07 17:18:55 crc kubenswrapper[4681]: I1007 17:18:55.015663 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-mt5xr"]
Oct 07 17:18:55 crc kubenswrapper[4681]: I1007 17:18:55.062244 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-98wq4"]
Oct 07 17:18:55 crc kubenswrapper[4681]: I1007 17:18:55.067850 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-spqlq"]
Oct 07 17:18:55 crc kubenswrapper[4681]: I1007 17:18:55.083471 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8m6q6"]
Oct 07 17:18:55 crc kubenswrapper[4681]: I1007 17:18:55.087125 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-8qj4n"]
Oct 07 17:18:55 crc kubenswrapper[4681]: I1007 17:18:55.170569 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-v7gh2"]
Oct 07 17:18:55 crc kubenswrapper[4681]: I1007 17:18:55.190641 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-ddgm8"]
Oct 07 17:18:55 crc kubenswrapper[4681]: I1007 17:18:55.191099 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-t75k9"]
Oct 07 17:18:55 crc kubenswrapper[4681]: W1007 17:18:55.215967 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc318e2b6_9014_471c_b54d_de14e50a1dfe.slice/crio-d99770f02a7463850b3c50281e35eb4574153dd6c77bbc62d173551a2369348d WatchSource:0}: Error finding container d99770f02a7463850b3c50281e35eb4574153dd6c77bbc62d173551a2369348d: Status 404 returned error can't find the container with id d99770f02a7463850b3c50281e35eb4574153dd6c77bbc62d173551a2369348d
Oct 07 17:18:55 crc kubenswrapper[4681]: I1007 17:18:55.269741 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-688l4"]
Oct 07 17:18:55 crc kubenswrapper[4681]: I1007 17:18:55.297979 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7636f86-a942-4f89-bc80-01a3ce70c13e-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665clkrns\" (UID: \"a7636f86-a942-4f89-bc80-01a3ce70c13e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665clkrns"
Oct 07 17:18:55 crc kubenswrapper[4681]: I1007 17:18:55.313431 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7636f86-a942-4f89-bc80-01a3ce70c13e-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665clkrns\" (UID: \"a7636f86-a942-4f89-bc80-01a3ce70c13e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665clkrns"
Oct 07 17:18:55 crc kubenswrapper[4681]: W1007 17:18:55.316232 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97a43b61_b120_4613_9b2a_603e1d90878a.slice/crio-14493edb77e2e5e446c405aa1cc043801154722d68475c08866370909d47d4fe WatchSource:0}: Error finding container 14493edb77e2e5e446c405aa1cc043801154722d68475c08866370909d47d4fe: Status 404 returned error can't find the container with id 14493edb77e2e5e446c405aa1cc043801154722d68475c08866370909d47d4fe
Oct 07 17:18:55 crc kubenswrapper[4681]: I1007 17:18:55.325559 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-plkk5"]
Oct 07 17:18:55 crc kubenswrapper[4681]: I1007 17:18:55.351364 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-vh9d7"]
Oct 07 17:18:55 crc kubenswrapper[4681]: I1007 17:18:55.357445 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-zhppp"]
Oct 07 17:18:55 crc kubenswrapper[4681]: I1007 17:18:55.383810 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665clkrns"
Oct 07 17:18:55 crc kubenswrapper[4681]: E1007 17:18:55.383938 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5kdgs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5d4d74dd89-zhppp_openstack-operators(1a3899b1-53e5-413b-b1c1-c7d2f2274b75): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Oct 07 17:18:55 crc kubenswrapper[4681]: W1007 17:18:55.382100 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c9bc247_6ea6_486c_956c_292930b2c111.slice/crio-62062cfe6ef27de69d427d6c1e0ed29d8cd7924adc59a85a6f47695d3812dd71 WatchSource:0}: Error finding container 62062cfe6ef27de69d427d6c1e0ed29d8cd7924adc59a85a6f47695d3812dd71: Status 404 returned error can't find the container with id 62062cfe6ef27de69d427d6c1e0ed29d8cd7924adc59a85a6f47695d3812dd71
Oct 07 17:18:55 crc kubenswrapper[4681]: E1007 17:18:55.388349 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:e4c4ff39c54c0af231fb781759ab50ed86285c74d38bdea43fa75646b762d842,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-twlfl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-7c7fc454ff-vh9d7_openstack-operators(9c9bc247-6ea6-486c-956c-292930b2c111): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-7c7fc454ff-vh9d7_openstack-operators(9c9bc247-6ea6-486c-956c-292930b2c111): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 17:18:55 crc kubenswrapper[4681]: I1007 17:18:55.469631 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-bxvlr"] Oct 07 17:18:55 crc kubenswrapper[4681]: I1007 17:18:55.476641 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-svd7g"] Oct 07 17:18:55 crc kubenswrapper[4681]: I1007 17:18:55.482514 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gn84j"] Oct 07 17:18:55 crc kubenswrapper[4681]: I1007 17:18:55.486771 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-sq4dv"] Oct 07 17:18:55 crc kubenswrapper[4681]: W1007 17:18:55.499324 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe78a905_7f1e_4ea1_baf4_5f84246df65f.slice/crio-893a0fa480f5da2fe65d07bec88b07de64322ef66aec506a370450b0e64f7be3 WatchSource:0}: Error finding container 893a0fa480f5da2fe65d07bec88b07de64322ef66aec506a370450b0e64f7be3: Status 404 returned error can't find the container with id 893a0fa480f5da2fe65d07bec88b07de64322ef66aec506a370450b0e64f7be3 Oct 07 17:18:55 crc kubenswrapper[4681]: W1007 17:18:55.508420 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf2c4c3b_cd2b_487e_bef4_071d2c9f0eb4.slice/crio-4b68cab14e35db0dd43ce5c9c779501df69b271a1ce49a28e56892e1426103f2 WatchSource:0}: Error finding container 4b68cab14e35db0dd43ce5c9c779501df69b271a1ce49a28e56892e1426103f2: Status 404 returned error can't find the container with id 4b68cab14e35db0dd43ce5c9c779501df69b271a1ce49a28e56892e1426103f2 Oct 07 17:18:55 crc kubenswrapper[4681]: W1007 17:18:55.512856 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod357d30fc_7c29_4bea_a20a_926b5723bcb0.slice/crio-e14474e4a5e3267016cc140d7f484dcad9fcf8a3c8cb87f2fbcbe6b798530698 WatchSource:0}: Error finding container e14474e4a5e3267016cc140d7f484dcad9fcf8a3c8cb87f2fbcbe6b798530698: Status 404 returned error can't find the container with id e14474e4a5e3267016cc140d7f484dcad9fcf8a3c8cb87f2fbcbe6b798530698 Oct 07 17:18:55 crc kubenswrapper[4681]: E1007 17:18:55.513322 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9z7t9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-6859f9b676-bxvlr_openstack-operators(df2c4c3b-cd2b-487e-bef4-071d2c9f0eb4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 17:18:55 crc kubenswrapper[4681]: W1007 17:18:55.517180 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e4f29f4_5ec2_4476_9153_954cc984443f.slice/crio-9201fb6c58301fcf492beef8883aa0fd8443739742168f21965813dfd35b721e WatchSource:0}: Error finding container 9201fb6c58301fcf492beef8883aa0fd8443739742168f21965813dfd35b721e: Status 404 returned error can't find the container with id 9201fb6c58301fcf492beef8883aa0fd8443739742168f21965813dfd35b721e Oct 07 17:18:55 crc kubenswrapper[4681]: E1007 17:18:55.518261 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7nwf7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-sq4dv_openstack-operators(357d30fc-7c29-4bea-a20a-926b5723bcb0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 17:18:55 crc kubenswrapper[4681]: E1007 17:18:55.519335 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-sq4dv" podUID="357d30fc-7c29-4bea-a20a-926b5723bcb0" Oct 07 17:18:55 crc kubenswrapper[4681]: E1007 17:18:55.521428 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8mprf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6cbc6dd547-gn84j_openstack-operators(6e4f29f4-5ec2-4476-9153-954cc984443f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 17:18:55 crc kubenswrapper[4681]: E1007 17:18:55.608603 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-zhppp" podUID="1a3899b1-53e5-413b-b1c1-c7d2f2274b75" Oct 07 17:18:55 crc kubenswrapper[4681]: E1007 17:18:55.663015 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vh9d7" podUID="9c9bc247-6ea6-486c-956c-292930b2c111" Oct 07 17:18:55 crc kubenswrapper[4681]: I1007 17:18:55.702253 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-9fwcg"] Oct 07 17:18:55 crc kubenswrapper[4681]: E1007 17:18:55.715998 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-bxvlr" podUID="df2c4c3b-cd2b-487e-bef4-071d2c9f0eb4" Oct 07 17:18:55 crc kubenswrapper[4681]: W1007 17:18:55.730326 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5f3e3d1_15b0_46ed_8fbb_bc57c0f4fdf4.slice/crio-fc17bb8485c3b92ba0c46fc7786491d5f823779573d70ebe3c0e2be0eeb5c2f8 WatchSource:0}: Error finding container fc17bb8485c3b92ba0c46fc7786491d5f823779573d70ebe3c0e2be0eeb5c2f8: Status 404 returned error can't find the container with id fc17bb8485c3b92ba0c46fc7786491d5f823779573d70ebe3c0e2be0eeb5c2f8 Oct 07 17:18:55 crc kubenswrapper[4681]: E1007 17:18:55.770384 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gn84j" podUID="6e4f29f4-5ec2-4476-9153-954cc984443f" Oct 07 17:18:55 crc kubenswrapper[4681]: I1007 17:18:55.809152 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/66b31094-5895-41aa-a268-fd2d13990f9f-cert\") pod \"openstack-operator-controller-manager-77dffbdc98-vqctw\" (UID: \"66b31094-5895-41aa-a268-fd2d13990f9f\") " pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-vqctw" Oct 07 17:18:55 crc kubenswrapper[4681]: I1007 17:18:55.814239 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66b31094-5895-41aa-a268-fd2d13990f9f-cert\") pod \"openstack-operator-controller-manager-77dffbdc98-vqctw\" (UID: \"66b31094-5895-41aa-a268-fd2d13990f9f\") " pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-vqctw" Oct 07 17:18:55 crc kubenswrapper[4681]: I1007 17:18:55.878848 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-vqctw" Oct 07 17:18:55 crc kubenswrapper[4681]: I1007 17:18:55.977082 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-spqlq" event={"ID":"5cc0eff1-427a-4489-8957-f5148e6a0630","Type":"ContainerStarted","Data":"ecb9af2efc93aee2d6d946aab119227c1e44a419dd07ccbe1ba646ba6e1d2f52"} Oct 07 17:18:55 crc kubenswrapper[4681]: I1007 17:18:55.989152 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665clkrns"] Oct 07 17:18:56 crc kubenswrapper[4681]: I1007 17:18:56.016058 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-bxvlr" event={"ID":"df2c4c3b-cd2b-487e-bef4-071d2c9f0eb4","Type":"ContainerStarted","Data":"77e9f9b4b58d078a08df1cacd88f34ab077a7fb9a3e4238dee43bf45aaac6b4f"} Oct 07 17:18:56 crc kubenswrapper[4681]: I1007 17:18:56.016110 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-bxvlr" event={"ID":"df2c4c3b-cd2b-487e-bef4-071d2c9f0eb4","Type":"ContainerStarted","Data":"4b68cab14e35db0dd43ce5c9c779501df69b271a1ce49a28e56892e1426103f2"} Oct 07 17:18:56 crc kubenswrapper[4681]: E1007 17:18:56.018129 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-bxvlr" podUID="df2c4c3b-cd2b-487e-bef4-071d2c9f0eb4" Oct 07 17:18:56 crc kubenswrapper[4681]: I1007 17:18:56.039052 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-svd7g" event={"ID":"be78a905-7f1e-4ea1-baf4-5f84246df65f","Type":"ContainerStarted","Data":"893a0fa480f5da2fe65d07bec88b07de64322ef66aec506a370450b0e64f7be3"} Oct 07 17:18:56 crc kubenswrapper[4681]: I1007 17:18:56.054805 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-mt5xr" event={"ID":"049a3d2e-6274-44c0-8b56-d19e8d8b1cfc","Type":"ContainerStarted","Data":"f0d4db7bd6e7f9e3b0324b45dd7b7ce98e8816ab906e6fdc5e3c49cc9a812cde"} Oct 07 17:18:56 crc kubenswrapper[4681]: I1007 17:18:56.061414 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-688l4" 
event={"ID":"97a43b61-b120-4613-9b2a-603e1d90878a","Type":"ContainerStarted","Data":"14493edb77e2e5e446c405aa1cc043801154722d68475c08866370909d47d4fe"} Oct 07 17:18:56 crc kubenswrapper[4681]: I1007 17:18:56.065268 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-zhppp" event={"ID":"1a3899b1-53e5-413b-b1c1-c7d2f2274b75","Type":"ContainerStarted","Data":"30d9c8998683d45fbea5f963afb96c958c4ef85635666d884b3558cd8fd2d440"} Oct 07 17:18:56 crc kubenswrapper[4681]: I1007 17:18:56.065302 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-zhppp" event={"ID":"1a3899b1-53e5-413b-b1c1-c7d2f2274b75","Type":"ContainerStarted","Data":"16618b4c105d7cbcfdd79b2d7e80f1d3546b890dcd6ec2a9ec94682c96f98a5b"} Oct 07 17:18:56 crc kubenswrapper[4681]: E1007 17:18:56.087472 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-zhppp" podUID="1a3899b1-53e5-413b-b1c1-c7d2f2274b75" Oct 07 17:18:56 crc kubenswrapper[4681]: I1007 17:18:56.110032 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-8qj4n" event={"ID":"8e8c5ada-0313-4a16-b9cd-17d39ce932ca","Type":"ContainerStarted","Data":"aa7137fe685f9cd9e71ae7db6b4adf1df62579d248333e745a93dd5710fef657"} Oct 07 17:18:56 crc kubenswrapper[4681]: I1007 17:18:56.132160 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8m6q6" event={"ID":"bd602c09-19c5-45a7-b8fa-4202e147bbf9","Type":"ContainerStarted","Data":"83f39213ebd1aee1553f5a50f8ed6a692c04054930960a5b5e09420f5eee8384"} Oct 07 17:18:56 crc kubenswrapper[4681]: I1007 17:18:56.175271 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-v7gh2" event={"ID":"10d09ccc-8bc7-4bf8-8bb4-b5bd1b234b28","Type":"ContainerStarted","Data":"10881c4fc5a64ee894ac83d2ddb0c397bb76d7feb8ab2aa5c36f791bade4d420"} Oct 07 17:18:56 crc kubenswrapper[4681]: I1007 17:18:56.219047 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-plkk5" event={"ID":"52146033-65f3-42f4-b0b8-2b550445305f","Type":"ContainerStarted","Data":"246c61cc7d4ea4bd810b46712a0aca721ef88811f3a4b95fea3cc43bf3dcd976"} Oct 07 17:18:56 crc kubenswrapper[4681]: I1007 17:18:56.221825 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-t75k9" event={"ID":"c318e2b6-9014-471c-b54d-de14e50a1dfe","Type":"ContainerStarted","Data":"d99770f02a7463850b3c50281e35eb4574153dd6c77bbc62d173551a2369348d"} Oct 07 17:18:56 crc kubenswrapper[4681]: I1007 17:18:56.229577 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-sq4dv" event={"ID":"357d30fc-7c29-4bea-a20a-926b5723bcb0","Type":"ContainerStarted","Data":"e14474e4a5e3267016cc140d7f484dcad9fcf8a3c8cb87f2fbcbe6b798530698"} Oct 07 17:18:56 crc kubenswrapper[4681]: E1007 17:18:56.233286 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-sq4dv" podUID="357d30fc-7c29-4bea-a20a-926b5723bcb0" Oct 07 17:18:56 crc kubenswrapper[4681]: I1007 17:18:56.240230 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gn84j" event={"ID":"6e4f29f4-5ec2-4476-9153-954cc984443f","Type":"ContainerStarted","Data":"30feaea0af898efeb33fbb3af2f4cc91f397eeacee74a5a9dc22d104f02323e9"} Oct 07 17:18:56 crc kubenswrapper[4681]: I1007 17:18:56.240430 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gn84j" event={"ID":"6e4f29f4-5ec2-4476-9153-954cc984443f","Type":"ContainerStarted","Data":"9201fb6c58301fcf492beef8883aa0fd8443739742168f21965813dfd35b721e"} Oct 07 17:18:56 crc kubenswrapper[4681]: I1007 17:18:56.247569 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-98wq4" event={"ID":"c3da478d-c5f4-473c-9848-740845c9adf1","Type":"ContainerStarted","Data":"178e49e4593aa6349d5f60be873eca922528dedc8a36385b7f28afa6c9cbb076"} Oct 07 17:18:56 crc kubenswrapper[4681]: E1007 17:18:56.248782 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gn84j" podUID="6e4f29f4-5ec2-4476-9153-954cc984443f" Oct 07 17:18:56 crc kubenswrapper[4681]: I1007 17:18:56.258856 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-9fwcg" event={"ID":"f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4","Type":"ContainerStarted","Data":"fc17bb8485c3b92ba0c46fc7786491d5f823779573d70ebe3c0e2be0eeb5c2f8"} Oct 07 17:18:56 crc kubenswrapper[4681]: I1007 17:18:56.271074 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-ddgm8" event={"ID":"720e687c-21aa-4f31-bc4f-7be0f836ec16","Type":"ContainerStarted","Data":"31f9c56afb2d2d23296bf2186dd297988f6f0728bdf686f3d22e3be46e9621dc"} Oct 07 17:18:56 crc kubenswrapper[4681]: I1007 17:18:56.281309 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vh9d7" event={"ID":"9c9bc247-6ea6-486c-956c-292930b2c111","Type":"ContainerStarted","Data":"4ec079c3140b00b540a9536dbe684c15c044b9cbe06d75dda7e85a6a5603145a"} Oct 07 17:18:56 crc kubenswrapper[4681]: I1007 17:18:56.281355 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vh9d7" event={"ID":"9c9bc247-6ea6-486c-956c-292930b2c111","Type":"ContainerStarted","Data":"62062cfe6ef27de69d427d6c1e0ed29d8cd7924adc59a85a6f47695d3812dd71"} Oct 07 17:18:56 crc kubenswrapper[4681]: E1007 17:18:56.287550 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/nova-operator@sha256:e4c4ff39c54c0af231fb781759ab50ed86285c74d38bdea43fa75646b762d842\\\"\"" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vh9d7" podUID="9c9bc247-6ea6-486c-956c-292930b2c111" Oct 07 17:18:56 crc kubenswrapper[4681]: I1007 17:18:56.646962 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-77dffbdc98-vqctw"] Oct 07 17:18:57 crc kubenswrapper[4681]: I1007 17:18:57.339491 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665clkrns" event={"ID":"a7636f86-a942-4f89-bc80-01a3ce70c13e","Type":"ContainerStarted","Data":"6dc243dd11c4c5d6c3640436e372bb16eb0fe693f4c2c340c2f1b802c932d0c8"} Oct 07 17:18:57 crc kubenswrapper[4681]: I1007 17:18:57.356436 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-vqctw" event={"ID":"66b31094-5895-41aa-a268-fd2d13990f9f","Type":"ContainerStarted","Data":"310dc8dcdb16ebb008af6a1c3d3947b6454713e5bba5347617ff8ac2a670a8e8"} Oct 07 17:18:57 crc kubenswrapper[4681]: I1007 17:18:57.356486 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-vqctw" Oct 07 17:18:57 crc kubenswrapper[4681]: I1007 17:18:57.356496 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-vqctw" event={"ID":"66b31094-5895-41aa-a268-fd2d13990f9f","Type":"ContainerStarted","Data":"d81070729319b4317cad6b4ca7397e08fefdaf0dec7d125dd33564201965c733"} Oct 07 17:18:57 crc kubenswrapper[4681]: I1007 17:18:57.356504 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-vqctw" event={"ID":"66b31094-5895-41aa-a268-fd2d13990f9f","Type":"ContainerStarted","Data":"f11cad51090f3615d14edd560303ec457c993ba904f25ed6b81cc81bfdaedd08"} Oct 07 17:18:57 crc kubenswrapper[4681]: E1007 17:18:57.365444 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-zhppp" podUID="1a3899b1-53e5-413b-b1c1-c7d2f2274b75" Oct 07 17:18:57 crc kubenswrapper[4681]: E1007 17:18:57.365820 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-sq4dv" podUID="357d30fc-7c29-4bea-a20a-926b5723bcb0" Oct 07 17:18:57 crc kubenswrapper[4681]: E1007 17:18:57.365862 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-bxvlr" podUID="df2c4c3b-cd2b-487e-bef4-071d2c9f0eb4" Oct 07 17:18:57 crc kubenswrapper[4681]: E1007 
17:18:57.365969 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gn84j" podUID="6e4f29f4-5ec2-4476-9153-954cc984443f" Oct 07 17:18:57 crc kubenswrapper[4681]: E1007 17:18:57.368112 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:e4c4ff39c54c0af231fb781759ab50ed86285c74d38bdea43fa75646b762d842\\\"\"" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vh9d7" podUID="9c9bc247-6ea6-486c-956c-292930b2c111" Oct 07 17:18:57 crc kubenswrapper[4681]: I1007 17:18:57.522097 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-vqctw" podStartSLOduration=4.52207729 podStartE2EDuration="4.52207729s" podCreationTimestamp="2025-10-07 17:18:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:18:57.504268531 +0000 UTC m=+941.151680096" watchObservedRunningTime="2025-10-07 17:18:57.52207729 +0000 UTC m=+941.169488835" Oct 07 17:19:05 crc kubenswrapper[4681]: I1007 17:19:05.885862 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-77dffbdc98-vqctw" Oct 07 17:19:09 crc kubenswrapper[4681]: E1007 17:19:09.094003 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:a6f1dcab931fd4b818010607ede65150742563b3c81a3ad3d739ef7953cace0b" Oct 07 17:19:09 crc kubenswrapper[4681]: E1007 17:19:09.094457 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:a6f1dcab931fd4b818010607ede65150742563b3c81a3ad3d739ef7953cace0b,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s9pk8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7b5ccf6d9c-8m6q6_openstack-operators(bd602c09-19c5-45a7-b8fa-4202e147bbf9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 17:19:10 crc kubenswrapper[4681]: E1007 17:19:10.407346 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:785670b14b19ffd7e0799dcf3e3e275329fa822d4a604eace09574f8bb1f8162" Oct 07 17:19:10 crc kubenswrapper[4681]: E1007 17:19:10.407777 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:785670b14b19ffd7e0799dcf3e3e275329fa822d4a604eace09574f8bb1f8162,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s54xb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-649675d675-mt5xr_openstack-operators(049a3d2e-6274-44c0-8b56-d19e8d8b1cfc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 17:19:10 crc kubenswrapper[4681]: E1007 17:19:10.845662 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1" Oct 07 17:19:10 crc kubenswrapper[4681]: E1007 17:19:10.845948 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tc44p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-54689d9f88-688l4_openstack-operators(97a43b61-b120-4613-9b2a-603e1d90878a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 17:19:12 crc kubenswrapper[4681]: E1007 17:19:12.376130 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:445a1332c0eaaa21a5459d3ffe56a8696a6a61131c39dc7bb47571b251a30830" Oct 07 17:19:12 crc kubenswrapper[4681]: E1007 17:19:12.376674 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:445a1332c0eaaa21a5459d3ffe56a8696a6a61131c39dc7bb47571b251a30830,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zbvzs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-7d4d4f8d-c8nzf_openstack-operators(c7125c26-53ab-471e-bf33-05265e3f571a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 17:19:12 crc kubenswrapper[4681]: E1007 17:19:12.916312 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:5f96b563a63494082323bfced089d6589e0c89db43c6a39a2e912c79b1a278fe" Oct 07 17:19:12 crc kubenswrapper[4681]: E1007 17:19:12.916508 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:5f96b563a63494082323bfced089d6589e0c89db43c6a39a2e912c79b1a278fe,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c4vws,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6cd6d7bdf5-ddgm8_openstack-operators(720e687c-21aa-4f31-bc4f-7be0f836ec16): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 17:19:15 crc kubenswrapper[4681]: E1007 17:19:15.377134 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bcd1acac74e68eea5a9c3b7ba1bcb29d3a5b43423fc23c19ad4715bdac41f799" Oct 07 17:19:15 crc kubenswrapper[4681]: E1007 17:19:15.377593 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bcd1acac74e68eea5a9c3b7ba1bcb29d3a5b43423fc23c19ad4715bdac41f799,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEIL
OMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Nam
e:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_
URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},
EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pr2p2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-5dfbbd665clkrns_openstack-operators(a7636f86-a942-4f89-bc80-01a3ce70c13e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled"
logger="UnhandledError" Oct 07 17:19:15 crc kubenswrapper[4681]: E1007 17:19:15.851071 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:063aae1458289d1090a77c74c2b978b9eb978b0e4062c399f0cb5434a8dd2757" Oct 07 17:19:15 crc kubenswrapper[4681]: E1007 17:19:15.851284 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:063aae1458289d1090a77c74c2b978b9eb978b0e4062c399f0cb5434a8dd2757,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wd9f6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-65d89cfd9f-spqlq_openstack-operators(5cc0eff1-427a-4489-8957-f5148e6a0630): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 17:19:17 crc kubenswrapper[4681]: E1007 17:19:17.651657 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:585796b996a5b6d7ad68f0cb420bf4f2ee38c9f16f194e3111c162ce91ea8a7b" Oct 07 17:19:17 crc kubenswrapper[4681]: E1007 17:19:17.651838 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:585796b996a5b6d7ad68f0cb420bf4f2ee38c9f16f194e3111c162ce91ea8a7b,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gdthl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-75dfd9b554-9qrr7_openstack-operators(72f6dfae-3a77-46ad-874b-c94d9059566c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 17:19:18 crc kubenswrapper[4681]: E1007 17:19:18.084548 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:dfd044635f9df9ed1d249387fa622177db35cdc72475e1c570617b8d17c64862" Oct 07 17:19:18 crc kubenswrapper[4681]: E1007 17:19:18.084789 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:dfd044635f9df9ed1d249387fa622177db35cdc72475e1c570617b8d17c64862,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mtz9l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-8d984cc4d-plkk5_openstack-operators(52146033-65f3-42f4-b0b8-2b550445305f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 17:19:20 crc kubenswrapper[4681]: E1007 17:19:20.240126 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-649675d675-mt5xr" podUID="049a3d2e-6274-44c0-8b56-d19e8d8b1cfc" Oct 07 17:19:20 crc kubenswrapper[4681]: E1007 17:19:20.297340 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-c8nzf" podUID="c7125c26-53ab-471e-bf33-05265e3f571a" Oct 07 17:19:20 crc kubenswrapper[4681]: E1007 17:19:20.307911 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8m6q6" podUID="bd602c09-19c5-45a7-b8fa-4202e147bbf9" Oct 07 17:19:20 crc kubenswrapper[4681]: E1007 17:19:20.405985 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-688l4" podUID="97a43b61-b120-4613-9b2a-603e1d90878a" Oct 07 17:19:20 crc kubenswrapper[4681]: E1007 17:19:20.477354 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-ddgm8" podUID="720e687c-21aa-4f31-bc4f-7be0f836ec16" Oct 07 17:19:20 crc kubenswrapper[4681]: E1007 17:19:20.542599 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9qrr7" podUID="72f6dfae-3a77-46ad-874b-c94d9059566c" Oct 07 17:19:20 crc kubenswrapper[4681]: I1007 17:19:20.545385 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vh9d7" event={"ID":"9c9bc247-6ea6-486c-956c-292930b2c111","Type":"ContainerStarted","Data":"119c4c5ba4e373b082d5b6f277642ec519e54e33ad061fe43f0ae99d7baf4a47"} Oct 07 17:19:20 crc kubenswrapper[4681]: I1007 17:19:20.545721 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vh9d7" Oct 07 17:19:20 crc kubenswrapper[4681]: I1007 17:19:20.554176 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-98wq4" event={"ID":"c3da478d-c5f4-473c-9848-740845c9adf1","Type":"ContainerStarted","Data":"6743849eb40d213dc013f60b48ad3cae9e9a0fb94b8f09df243bd03f54e1d6ab"} Oct 07 17:19:20 crc kubenswrapper[4681]: E1007 17:19:20.567984 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-plkk5" podUID="52146033-65f3-42f4-b0b8-2b550445305f" Oct 07 17:19:20 crc kubenswrapper[4681]: I1007 17:19:20.582691 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vh9d7" podStartSLOduration=3.017170711 podStartE2EDuration="27.582672122s" podCreationTimestamp="2025-10-07 17:18:53 +0000 UTC" firstStartedPulling="2025-10-07 17:18:55.3881142 +0000 UTC m=+939.035525755" lastFinishedPulling="2025-10-07 17:19:19.953615611 +0000 UTC m=+963.601027166" observedRunningTime="2025-10-07 17:19:20.581846628 +0000 UTC m=+964.229258193" watchObservedRunningTime="2025-10-07 17:19:20.582672122 +0000 UTC m=+964.230083677" Oct 07 17:19:20 crc kubenswrapper[4681]: I1007 17:19:20.583625 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8m6q6" event={"ID":"bd602c09-19c5-45a7-b8fa-4202e147bbf9","Type":"ContainerStarted","Data":"4780ff3c55b55f74b4ed498fef55c95ba0e82f7dca4090ebdfd59dab6cfeb535"} Oct 07 17:19:20 crc kubenswrapper[4681]: E1007 17:19:20.584076 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:585796b996a5b6d7ad68f0cb420bf4f2ee38c9f16f194e3111c162ce91ea8a7b\\\"\"" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9qrr7" podUID="72f6dfae-3a77-46ad-874b-c94d9059566c" Oct 07 17:19:20 crc kubenswrapper[4681]: I1007 17:19:20.591466 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-9fwcg" 
event={"ID":"f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4","Type":"ContainerStarted","Data":"82c6ec37376f31eb9cdbed7a3646c26f4c60b8546fd92a742da8a0ed89ac2a1f"} Oct 07 17:19:20 crc kubenswrapper[4681]: I1007 17:19:20.617038 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-v7gh2" event={"ID":"10d09ccc-8bc7-4bf8-8bb4-b5bd1b234b28","Type":"ContainerStarted","Data":"1f8e14b7ab34fe077f94303fb06a985964c2c68985c33d4ac75c05074c3ce4e3"} Oct 07 17:19:20 crc kubenswrapper[4681]: E1007 17:19:20.635950 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665clkrns" podUID="a7636f86-a942-4f89-bc80-01a3ce70c13e" Oct 07 17:19:20 crc kubenswrapper[4681]: I1007 17:19:20.636595 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-ddgm8" event={"ID":"720e687c-21aa-4f31-bc4f-7be0f836ec16","Type":"ContainerStarted","Data":"be9fe526029851de615088f03a957db133ae5544866d57b7b35796992615cba8"} Oct 07 17:19:20 crc kubenswrapper[4681]: E1007 17:19:20.638098 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:5f96b563a63494082323bfced089d6589e0c89db43c6a39a2e912c79b1a278fe\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-ddgm8" podUID="720e687c-21aa-4f31-bc4f-7be0f836ec16" Oct 07 17:19:20 crc kubenswrapper[4681]: I1007 17:19:20.658263 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-dhcz7" event={"ID":"fe9f244f-7a1b-43f2-b1d2-08dcf0454fc3","Type":"ContainerStarted","Data":"1926b04c14e8b3f9973fc78b642bac42e24798ad3b4217607445ab88694db7e0"} Oct 07 17:19:20 crc kubenswrapper[4681]: I1007 17:19:20.662342 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-c8nzf" event={"ID":"c7125c26-53ab-471e-bf33-05265e3f571a","Type":"ContainerStarted","Data":"81d2f8cb1f36341a45ab807a63a17febf7bc97f2f36bfd55670fe00056806729"} Oct 07 17:19:20 crc kubenswrapper[4681]: E1007 17:19:20.664660 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:445a1332c0eaaa21a5459d3ffe56a8696a6a61131c39dc7bb47571b251a30830\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-c8nzf" podUID="c7125c26-53ab-471e-bf33-05265e3f571a" Oct 07 17:19:20 crc kubenswrapper[4681]: I1007 17:19:20.674420 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-8qj4n" event={"ID":"8e8c5ada-0313-4a16-b9cd-17d39ce932ca","Type":"ContainerStarted","Data":"712be36ffd1c6bd7e7abc56cb2647be170a5b1f7d57ffc45b610fc4c53e4934a"} Oct 07 17:19:20 crc kubenswrapper[4681]: I1007 17:19:20.678792 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-mt5xr" 
event={"ID":"049a3d2e-6274-44c0-8b56-d19e8d8b1cfc","Type":"ContainerStarted","Data":"2c6e883fc697e0d934759c208fcf6a3cbb75e96984176b8462a63953a2669539"} Oct 07 17:19:20 crc kubenswrapper[4681]: I1007 17:19:20.687813 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-688l4" event={"ID":"97a43b61-b120-4613-9b2a-603e1d90878a","Type":"ContainerStarted","Data":"f0dded196652f10f8ef3d58e58a0baaf0d7a620340b2cd39f572d0617a97d138"} Oct 07 17:19:20 crc kubenswrapper[4681]: E1007 17:19:20.700870 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1\\\"\"" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-688l4" podUID="97a43b61-b120-4613-9b2a-603e1d90878a" Oct 07 17:19:20 crc kubenswrapper[4681]: E1007 17:19:20.778434 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-spqlq" podUID="5cc0eff1-427a-4489-8957-f5148e6a0630" Oct 07 17:19:21 crc kubenswrapper[4681]: I1007 17:19:21.694465 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-plkk5" event={"ID":"52146033-65f3-42f4-b0b8-2b550445305f","Type":"ContainerStarted","Data":"a154bb053914135d97894908837a56bfa8b39b63f5e2481d48b3ba2d887573f1"} Oct 07 17:19:21 crc kubenswrapper[4681]: E1007 17:19:21.696357 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:dfd044635f9df9ed1d249387fa622177db35cdc72475e1c570617b8d17c64862\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-plkk5" podUID="52146033-65f3-42f4-b0b8-2b550445305f" Oct 07 17:19:21 crc kubenswrapper[4681]: I1007 17:19:21.697934 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-spqlq" event={"ID":"5cc0eff1-427a-4489-8957-f5148e6a0630","Type":"ContainerStarted","Data":"e0a6ba13a73ec553fcaa64f089c33a20e2c19d48551dc6f1b8c054a1ced5137d"} Oct 07 17:19:21 crc kubenswrapper[4681]: E1007 17:19:21.698969 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:063aae1458289d1090a77c74c2b978b9eb978b0e4062c399f0cb5434a8dd2757\\\"\"" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-spqlq" podUID="5cc0eff1-427a-4489-8957-f5148e6a0630" Oct 07 17:19:21 crc kubenswrapper[4681]: I1007 17:19:21.700391 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-98wq4" event={"ID":"c3da478d-c5f4-473c-9848-740845c9adf1","Type":"ContainerStarted","Data":"7bb7e407c840f85e03c923d996b3c8929977463a46b73641bb1587a31971e4e6"} Oct 07 17:19:21 crc kubenswrapper[4681]: I1007 17:19:21.700563 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/heat-operator-controller-manager-54b4974c45-98wq4" Oct 07 17:19:21 crc kubenswrapper[4681]: I1007 17:19:21.702598 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-bxvlr" event={"ID":"df2c4c3b-cd2b-487e-bef4-071d2c9f0eb4","Type":"ContainerStarted","Data":"def50c877cb822201a412bbe1aaae714883254a3a69a71096e4aff9140f4f903"} Oct 07 17:19:21 crc kubenswrapper[4681]: I1007 17:19:21.703120 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-bxvlr" Oct 07 17:19:21 crc kubenswrapper[4681]: I1007 17:19:21.704246 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9qrr7" event={"ID":"72f6dfae-3a77-46ad-874b-c94d9059566c","Type":"ContainerStarted","Data":"4350e1db72d73edf7b397725b0cc963c0deeb5510a60f4dc14af9045bf6de747"} Oct 07 17:19:21 crc kubenswrapper[4681]: E1007 17:19:21.705112 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:585796b996a5b6d7ad68f0cb420bf4f2ee38c9f16f194e3111c162ce91ea8a7b\\\"\"" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9qrr7" podUID="72f6dfae-3a77-46ad-874b-c94d9059566c" Oct 07 17:19:21 crc kubenswrapper[4681]: I1007 17:19:21.706846 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665clkrns" event={"ID":"a7636f86-a942-4f89-bc80-01a3ce70c13e","Type":"ContainerStarted","Data":"caa9e93f7f60c7e045828fe220d3cc7536e7d108b6588e7f2110b13717e0f6f2"} Oct 07 17:19:21 crc kubenswrapper[4681]: E1007 17:19:21.707936 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bcd1acac74e68eea5a9c3b7ba1bcb29d3a5b43423fc23c19ad4715bdac41f799\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665clkrns" podUID="a7636f86-a942-4f89-bc80-01a3ce70c13e" Oct 07 17:19:21 crc kubenswrapper[4681]: I1007 17:19:21.708710 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-sq4dv" event={"ID":"357d30fc-7c29-4bea-a20a-926b5723bcb0","Type":"ContainerStarted","Data":"ffcf779a670d34d48e775801038efc189bed75fc9bc420b354801b467a612ab8"} Oct 07 17:19:21 crc kubenswrapper[4681]: I1007 17:19:21.710650 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gn84j" event={"ID":"6e4f29f4-5ec2-4476-9153-954cc984443f","Type":"ContainerStarted","Data":"d8c2d7cff410261882b38de3ad25c9c75c6d9bc6b5b9f49cb3ebf035daf36ad2"} Oct 07 17:19:21 crc kubenswrapper[4681]: I1007 17:19:21.711048 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gn84j" Oct 07 17:19:21 crc kubenswrapper[4681]: I1007 17:19:21.712387 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-svd7g" 
event={"ID":"be78a905-7f1e-4ea1-baf4-5f84246df65f","Type":"ContainerStarted","Data":"be6b405f21b33fb4cb7799d6127dc5c0932187d17658d0181dc2e759a1fbe635"} Oct 07 17:19:21 crc kubenswrapper[4681]: I1007 17:19:21.714568 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-zhppp" event={"ID":"1a3899b1-53e5-413b-b1c1-c7d2f2274b75","Type":"ContainerStarted","Data":"a9720b8aaa257324ee7e8026969c1604b1b169d48e01eda112b718d95744bb80"} Oct 07 17:19:21 crc kubenswrapper[4681]: I1007 17:19:21.714914 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-zhppp" Oct 07 17:19:21 crc kubenswrapper[4681]: I1007 17:19:21.716261 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-rhvt8" event={"ID":"049764d0-d62e-4553-9628-3d1b7258d126","Type":"ContainerStarted","Data":"ba0ba1e5d8a30f674f9d506b95fa0116251be0a3aa12024a6585c0ab1ed366b9"} Oct 07 17:19:21 crc kubenswrapper[4681]: I1007 17:19:21.717759 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-dhcz7" event={"ID":"fe9f244f-7a1b-43f2-b1d2-08dcf0454fc3","Type":"ContainerStarted","Data":"c4f4bf00d5a661bd8449d6cf1c7e1bd8a69ae1a04fa5bc6d1e0e8dca573408b7"} Oct 07 17:19:21 crc kubenswrapper[4681]: I1007 17:19:21.719845 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-t75k9" event={"ID":"c318e2b6-9014-471c-b54d-de14e50a1dfe","Type":"ContainerStarted","Data":"a5390967eb1ebf8280f366d4f58ef94b8c7c722a5273a219038e0189bba6b295"} Oct 07 17:19:21 crc kubenswrapper[4681]: E1007 17:19:21.722000 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:5f96b563a63494082323bfced089d6589e0c89db43c6a39a2e912c79b1a278fe\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-ddgm8" podUID="720e687c-21aa-4f31-bc4f-7be0f836ec16" Oct 07 17:19:21 crc kubenswrapper[4681]: E1007 17:19:21.722124 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:445a1332c0eaaa21a5459d3ffe56a8696a6a61131c39dc7bb47571b251a30830\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-c8nzf" podUID="c7125c26-53ab-471e-bf33-05265e3f571a" Oct 07 17:19:21 crc kubenswrapper[4681]: I1007 17:19:21.797378 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-zhppp" podStartSLOduration=4.231678873 podStartE2EDuration="28.797360479s" podCreationTimestamp="2025-10-07 17:18:53 +0000 UTC" firstStartedPulling="2025-10-07 17:18:55.383551212 +0000 UTC m=+939.030962767" lastFinishedPulling="2025-10-07 17:19:19.949232818 +0000 UTC m=+963.596644373" observedRunningTime="2025-10-07 17:19:21.79341835 +0000 UTC m=+965.440829925" watchObservedRunningTime="2025-10-07 17:19:21.797360479 +0000 UTC m=+965.444772034" Oct 07 17:19:21 crc kubenswrapper[4681]: I1007 17:19:21.943177 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gn84j" podStartSLOduration=4.645820347 podStartE2EDuration="28.943161794s" podCreationTimestamp="2025-10-07 17:18:53 +0000 UTC" firstStartedPulling="2025-10-07 17:18:55.521100216 +0000 UTC m=+939.168511771" lastFinishedPulling="2025-10-07 17:19:19.818441663 +0000 UTC m=+963.465853218" observedRunningTime="2025-10-07 17:19:21.938464243 +0000 UTC m=+965.585875798" watchObservedRunningTime="2025-10-07 17:19:21.943161794 +0000 UTC m=+965.590573349" Oct 07 17:19:21 crc kubenswrapper[4681]: I1007 17:19:21.999076 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-bxvlr" podStartSLOduration=4.547057378 podStartE2EDuration="28.999056676s" podCreationTimestamp="2025-10-07 17:18:53 +0000 UTC" firstStartedPulling="2025-10-07 17:18:55.513225187 +0000 UTC m=+939.160636742" lastFinishedPulling="2025-10-07 17:19:19.965224485 +0000 UTC m=+963.612636040" observedRunningTime="2025-10-07 17:19:21.996385332 +0000 UTC m=+965.643796887" watchObservedRunningTime="2025-10-07 17:19:21.999056676 +0000 UTC m=+965.646468231" Oct 07 17:19:22 crc kubenswrapper[4681]: I1007 17:19:22.020032 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-98wq4" podStartSLOduration=4.346657877 podStartE2EDuration="29.020018013s" podCreationTimestamp="2025-10-07 17:18:53 +0000 UTC" firstStartedPulling="2025-10-07 17:18:55.108948637 +0000 UTC m=+938.756360192" lastFinishedPulling="2025-10-07 17:19:19.782308773 +0000 UTC m=+963.429720328" observedRunningTime="2025-10-07 17:19:22.015825155 +0000 UTC m=+965.663236710" watchObservedRunningTime="2025-10-07 17:19:22.020018013 +0000 UTC m=+965.667429568" Oct 07 17:19:22 crc kubenswrapper[4681]: I1007 17:19:22.728401 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-v7gh2" event={"ID":"10d09ccc-8bc7-4bf8-8bb4-b5bd1b234b28","Type":"ContainerStarted","Data":"7d346d149261fe81c07bd906a2083e1fcbdf0acc26f657b0a14eebe455f1b30d"} Oct 07 17:19:22 crc kubenswrapper[4681]: E1007 17:19:22.730123 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bcd1acac74e68eea5a9c3b7ba1bcb29d3a5b43423fc23c19ad4715bdac41f799\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665clkrns" podUID="a7636f86-a942-4f89-bc80-01a3ce70c13e" Oct 07 17:19:22 crc kubenswrapper[4681]: E1007 17:19:22.730129 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:dfd044635f9df9ed1d249387fa622177db35cdc72475e1c570617b8d17c64862\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-plkk5" podUID="52146033-65f3-42f4-b0b8-2b550445305f" Oct 07 17:19:22 crc kubenswrapper[4681]: E1007 17:19:22.730624 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:063aae1458289d1090a77c74c2b978b9eb978b0e4062c399f0cb5434a8dd2757\\\"\"" 
pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-spqlq" podUID="5cc0eff1-427a-4489-8957-f5148e6a0630" Oct 07 17:19:22 crc kubenswrapper[4681]: I1007 17:19:22.763065 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-sq4dv" podStartSLOduration=5.331036287 podStartE2EDuration="29.763046958s" podCreationTimestamp="2025-10-07 17:18:53 +0000 UTC" firstStartedPulling="2025-10-07 17:18:55.518151944 +0000 UTC m=+939.165563499" lastFinishedPulling="2025-10-07 17:19:19.950162605 +0000 UTC m=+963.597574170" observedRunningTime="2025-10-07 17:19:22.043958781 +0000 UTC m=+965.691370336" watchObservedRunningTime="2025-10-07 17:19:22.763046958 +0000 UTC m=+966.410458513" Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.745155 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-9fwcg" event={"ID":"f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4","Type":"ContainerStarted","Data":"394e5a97271b86955dbd08c1643863067a5171fcc5fede845a8c9c840f8cd176"} Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.746296 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-9fwcg" Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.749306 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-svd7g" event={"ID":"be78a905-7f1e-4ea1-baf4-5f84246df65f","Type":"ContainerStarted","Data":"70626fda3521c33f7c7353d78dff87a04209a4d882c7d17e5759791ec05b3a85"} Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.749831 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-svd7g" Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.755400 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-mt5xr" event={"ID":"049a3d2e-6274-44c0-8b56-d19e8d8b1cfc","Type":"ContainerStarted","Data":"113687b0f3f2f596f2f8c63b8e9bdec7fd077a43e19f83e452b2610282447146"} Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.755517 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-649675d675-mt5xr" Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.758331 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-688l4" event={"ID":"97a43b61-b120-4613-9b2a-603e1d90878a","Type":"ContainerStarted","Data":"35ce7fc99293568f9b45ebb7231454cd3a37dd3d63d6045ee64809400e776fb3"} Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.758517 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-688l4" Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.760029 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-rhvt8" event={"ID":"049764d0-d62e-4553-9628-3d1b7258d126","Type":"ContainerStarted","Data":"827238ac998968a218ed5e5b913471eb1ba44f58da74ead85aa635e0c4535572"} Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.760149 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-rhvt8" Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.761810 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-8qj4n" event={"ID":"8e8c5ada-0313-4a16-b9cd-17d39ce932ca","Type":"ContainerStarted","Data":"e65e2a61c5e2d42603bb8938a197485d0594ad77d39a448a6fa89d1c93ad7c2b"} Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.761872 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-8qj4n" Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.763118 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-t75k9" event={"ID":"c318e2b6-9014-471c-b54d-de14e50a1dfe","Type":"ContainerStarted","Data":"fc1fc86f6a4d1fdcba5f2a80a6c42fa60635d4d3ca737bb7c3d61b9594b833c2"} Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.763238 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-t75k9" Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.764965 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8m6q6" event={"ID":"bd602c09-19c5-45a7-b8fa-4202e147bbf9","Type":"ContainerStarted","Data":"63410326b80ccc4bba51d22a129590ae6feead1cf8b3a8de73b7dcd1370d9754"} Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.765155 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8m6q6" Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.779819 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-9fwcg" podStartSLOduration=7.155553688 podStartE2EDuration="30.779795114s" podCreationTimestamp="2025-10-07 17:18:53 +0000 UTC" firstStartedPulling="2025-10-07 17:18:55.734289604 +0000 UTC m=+939.381701159" lastFinishedPulling="2025-10-07 17:19:19.35853104 +0000 UTC m=+963.005942585" observedRunningTime="2025-10-07 17:19:23.774103365 +0000 UTC m=+967.421514930" watchObservedRunningTime="2025-10-07 17:19:23.779795114 +0000 UTC m=+967.427206669" Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.826797 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-rhvt8" podStartSLOduration=7.242258625 podStartE2EDuration="31.826772357s" podCreationTimestamp="2025-10-07 17:18:52 +0000 UTC" firstStartedPulling="2025-10-07 17:18:54.772644829 +0000 UTC m=+938.420056384" lastFinishedPulling="2025-10-07 17:19:19.357158561 +0000 UTC m=+963.004570116" observedRunningTime="2025-10-07 17:19:23.812348523 +0000 UTC m=+967.459760078" watchObservedRunningTime="2025-10-07 17:19:23.826772357 +0000 UTC m=+967.474183912" Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.845004 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-688l4" podStartSLOduration=2.88902475 podStartE2EDuration="30.844988006s" podCreationTimestamp="2025-10-07 17:18:53 +0000 UTC" firstStartedPulling="2025-10-07 17:18:55.359908602 +0000 UTC m=+939.007320157" lastFinishedPulling="2025-10-07 17:19:23.315871858 +0000 UTC 
m=+966.963283413" observedRunningTime="2025-10-07 17:19:23.842203728 +0000 UTC m=+967.489615273" watchObservedRunningTime="2025-10-07 17:19:23.844988006 +0000 UTC m=+967.492399561" Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.856816 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-v7gh2" Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.866446 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-8qj4n" podStartSLOduration=6.630939809 podStartE2EDuration="30.866431116s" podCreationTimestamp="2025-10-07 17:18:53 +0000 UTC" firstStartedPulling="2025-10-07 17:18:55.121710135 +0000 UTC m=+938.769121690" lastFinishedPulling="2025-10-07 17:19:19.357201442 +0000 UTC m=+963.004612997" observedRunningTime="2025-10-07 17:19:23.862900126 +0000 UTC m=+967.510311681" watchObservedRunningTime="2025-10-07 17:19:23.866431116 +0000 UTC m=+967.513842671" Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.882821 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-t75k9" podStartSLOduration=6.743985846 podStartE2EDuration="30.882804953s" podCreationTimestamp="2025-10-07 17:18:53 +0000 UTC" firstStartedPulling="2025-10-07 17:18:55.218359355 +0000 UTC m=+938.865770910" lastFinishedPulling="2025-10-07 17:19:19.357178462 +0000 UTC m=+963.004590017" observedRunningTime="2025-10-07 17:19:23.881959779 +0000 UTC m=+967.529371334" watchObservedRunningTime="2025-10-07 17:19:23.882804953 +0000 UTC m=+967.530216508" Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.905376 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-649675d675-mt5xr" podStartSLOduration=2.749648175 podStartE2EDuration="30.905360614s" podCreationTimestamp="2025-10-07 17:18:53 +0000 UTC" firstStartedPulling="2025-10-07 17:18:55.027658906 +0000 UTC m=+938.675070461" lastFinishedPulling="2025-10-07 17:19:23.183371355 +0000 UTC m=+966.830782900" observedRunningTime="2025-10-07 17:19:23.90202588 +0000 UTC m=+967.549437445" watchObservedRunningTime="2025-10-07 17:19:23.905360614 +0000 UTC m=+967.552772169" Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.925477 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-svd7g" podStartSLOduration=7.069646738 podStartE2EDuration="30.925459815s" podCreationTimestamp="2025-10-07 17:18:53 +0000 UTC" firstStartedPulling="2025-10-07 17:18:55.503011191 +0000 UTC m=+939.150422746" lastFinishedPulling="2025-10-07 17:19:19.358824268 +0000 UTC m=+963.006235823" observedRunningTime="2025-10-07 17:19:23.92526542 +0000 UTC m=+967.572676975" watchObservedRunningTime="2025-10-07 17:19:23.925459815 +0000 UTC m=+967.572871360" Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.940968 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-v7gh2" podStartSLOduration=6.785494457 podStartE2EDuration="30.940953358s" podCreationTimestamp="2025-10-07 17:18:53 +0000 UTC" firstStartedPulling="2025-10-07 17:18:55.201776342 +0000 UTC m=+938.849187897" lastFinishedPulling="2025-10-07 17:19:19.357235253 +0000 UTC m=+963.004646798" observedRunningTime="2025-10-07 17:19:23.937644285 +0000 
UTC m=+967.585055840" watchObservedRunningTime="2025-10-07 17:19:23.940953358 +0000 UTC m=+967.588364903" Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.958620 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8m6q6" podStartSLOduration=4.948465936 podStartE2EDuration="30.958602602s" podCreationTimestamp="2025-10-07 17:18:53 +0000 UTC" firstStartedPulling="2025-10-07 17:18:55.110363997 +0000 UTC m=+938.757775552" lastFinishedPulling="2025-10-07 17:19:21.120500663 +0000 UTC m=+964.767912218" observedRunningTime="2025-10-07 17:19:23.955132354 +0000 UTC m=+967.602543909" watchObservedRunningTime="2025-10-07 17:19:23.958602602 +0000 UTC m=+967.606014157" Oct 07 17:19:23 crc kubenswrapper[4681]: I1007 17:19:23.987275 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-dhcz7" podStartSLOduration=7.168441041 podStartE2EDuration="31.987256552s" podCreationTimestamp="2025-10-07 17:18:52 +0000 UTC" firstStartedPulling="2025-10-07 17:18:54.538043242 +0000 UTC m=+938.185454797" lastFinishedPulling="2025-10-07 17:19:19.356858763 +0000 UTC m=+963.004270308" observedRunningTime="2025-10-07 17:19:23.984057263 +0000 UTC m=+967.631468818" watchObservedRunningTime="2025-10-07 17:19:23.987256552 +0000 UTC m=+967.634668107" Oct 07 17:19:25 crc kubenswrapper[4681]: I1007 17:19:25.786887 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-9fwcg" Oct 07 17:19:32 crc kubenswrapper[4681]: I1007 17:19:32.031383 4681 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 17:19:32 crc kubenswrapper[4681]: I1007 17:19:32.832193 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-ddgm8" event={"ID":"720e687c-21aa-4f31-bc4f-7be0f836ec16","Type":"ContainerStarted","Data":"f395718bc14516c4486858523bec14f175868013ae4baab78ed44a0c5ff93240"} Oct 07 17:19:32 crc kubenswrapper[4681]: I1007 17:19:32.832673 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-ddgm8" Oct 07 17:19:32 crc kubenswrapper[4681]: I1007 17:19:32.852803 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-ddgm8" podStartSLOduration=2.634286265 podStartE2EDuration="39.85278306s" podCreationTimestamp="2025-10-07 17:18:53 +0000 UTC" firstStartedPulling="2025-10-07 17:18:55.226224595 +0000 UTC m=+938.873636150" lastFinishedPulling="2025-10-07 17:19:32.44472139 +0000 UTC m=+976.092132945" observedRunningTime="2025-10-07 17:19:32.846209436 +0000 UTC m=+976.493621011" watchObservedRunningTime="2025-10-07 17:19:32.85278306 +0000 UTC m=+976.500194615" Oct 07 17:19:33 crc kubenswrapper[4681]: I1007 17:19:33.285204 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-dhcz7" Oct 07 17:19:33 crc kubenswrapper[4681]: I1007 17:19:33.287188 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-dhcz7" Oct 07 17:19:33 crc kubenswrapper[4681]: I1007 17:19:33.386681 4681 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-rhvt8" Oct 07 17:19:33 crc kubenswrapper[4681]: I1007 17:19:33.389069 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-98wq4" Oct 07 17:19:33 crc kubenswrapper[4681]: I1007 17:19:33.412459 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-8qj4n" Oct 07 17:19:33 crc kubenswrapper[4681]: I1007 17:19:33.556989 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8m6q6" Oct 07 17:19:33 crc kubenswrapper[4681]: I1007 17:19:33.590562 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-649675d675-mt5xr" Oct 07 17:19:33 crc kubenswrapper[4681]: I1007 17:19:33.825712 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-vh9d7" Oct 07 17:19:33 crc kubenswrapper[4681]: I1007 17:19:33.836309 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-t75k9" Oct 07 17:19:33 crc kubenswrapper[4681]: I1007 17:19:33.860238 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-v7gh2" Oct 07 17:19:33 crc kubenswrapper[4681]: I1007 17:19:33.969232 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-688l4" Oct 07 17:19:34 crc kubenswrapper[4681]: I1007 17:19:34.050690 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-bxvlr" Oct 07 17:19:34 crc kubenswrapper[4681]: I1007 17:19:34.099345 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-zhppp" Oct 07 17:19:34 crc kubenswrapper[4681]: I1007 17:19:34.178866 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-svd7g" Oct 07 17:19:34 crc kubenswrapper[4681]: I1007 17:19:34.195612 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gn84j" Oct 07 17:19:34 crc kubenswrapper[4681]: I1007 17:19:34.846642 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-plkk5" event={"ID":"52146033-65f3-42f4-b0b8-2b550445305f","Type":"ContainerStarted","Data":"fe6cc59d9315b273b71f266e829c83333c255a4ad9a770c6ea36cf8fa1a1ae05"} Oct 07 17:19:34 crc kubenswrapper[4681]: I1007 17:19:34.846849 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-plkk5" Oct 07 17:19:34 crc kubenswrapper[4681]: I1007 17:19:34.848248 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9qrr7" 
event={"ID":"72f6dfae-3a77-46ad-874b-c94d9059566c","Type":"ContainerStarted","Data":"bcc6c83241bd563226a44f6fade846f07e0970021b525a3ede00c0fcd59d58d9"} Oct 07 17:19:34 crc kubenswrapper[4681]: I1007 17:19:34.848545 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9qrr7" Oct 07 17:19:34 crc kubenswrapper[4681]: I1007 17:19:34.867354 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-plkk5" podStartSLOduration=2.70019601 podStartE2EDuration="41.867332509s" podCreationTimestamp="2025-10-07 17:18:53 +0000 UTC" firstStartedPulling="2025-10-07 17:18:55.382266076 +0000 UTC m=+939.029677631" lastFinishedPulling="2025-10-07 17:19:34.549402565 +0000 UTC m=+978.196814130" observedRunningTime="2025-10-07 17:19:34.866567647 +0000 UTC m=+978.513979202" watchObservedRunningTime="2025-10-07 17:19:34.867332509 +0000 UTC m=+978.514744074" Oct 07 17:19:34 crc kubenswrapper[4681]: I1007 17:19:34.889023 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9qrr7" podStartSLOduration=3.0815984 podStartE2EDuration="42.889000803s" podCreationTimestamp="2025-10-07 17:18:52 +0000 UTC" firstStartedPulling="2025-10-07 17:18:54.741230531 +0000 UTC m=+938.388642086" lastFinishedPulling="2025-10-07 17:19:34.548632934 +0000 UTC m=+978.196044489" observedRunningTime="2025-10-07 17:19:34.885526857 +0000 UTC m=+978.532938422" watchObservedRunningTime="2025-10-07 17:19:34.889000803 +0000 UTC m=+978.536412368" Oct 07 17:19:35 crc kubenswrapper[4681]: I1007 17:19:35.856404 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-c8nzf" event={"ID":"c7125c26-53ab-471e-bf33-05265e3f571a","Type":"ContainerStarted","Data":"0035cafddfec5f521ec6af2a97c440af31b3b117254bc5f43e839a6623253345"} Oct 07 17:19:35 crc kubenswrapper[4681]: I1007 17:19:35.857224 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-c8nzf" Oct 07 17:19:35 crc kubenswrapper[4681]: I1007 17:19:35.874763 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-c8nzf" podStartSLOduration=2.744230602 podStartE2EDuration="43.874747377s" podCreationTimestamp="2025-10-07 17:18:52 +0000 UTC" firstStartedPulling="2025-10-07 17:18:54.411255218 +0000 UTC m=+938.058666773" lastFinishedPulling="2025-10-07 17:19:35.541771993 +0000 UTC m=+979.189183548" observedRunningTime="2025-10-07 17:19:35.871844346 +0000 UTC m=+979.519255901" watchObservedRunningTime="2025-10-07 17:19:35.874747377 +0000 UTC m=+979.522158932" Oct 07 17:19:36 crc kubenswrapper[4681]: I1007 17:19:36.864636 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-spqlq" event={"ID":"5cc0eff1-427a-4489-8957-f5148e6a0630","Type":"ContainerStarted","Data":"6a2b97b039fc80aa8de3710eb7a10f27c30d91aff1ed51b9e173ef2939b3644e"} Oct 07 17:19:36 crc kubenswrapper[4681]: I1007 17:19:36.864913 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-spqlq" Oct 07 17:19:36 crc kubenswrapper[4681]: I1007 17:19:36.878282 4681 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-spqlq" podStartSLOduration=2.584278865 podStartE2EDuration="43.878264687s" podCreationTimestamp="2025-10-07 17:18:53 +0000 UTC" firstStartedPulling="2025-10-07 17:18:55.125463099 +0000 UTC m=+938.772874654" lastFinishedPulling="2025-10-07 17:19:36.419448921 +0000 UTC m=+980.066860476" observedRunningTime="2025-10-07 17:19:36.877865676 +0000 UTC m=+980.525277231" watchObservedRunningTime="2025-10-07 17:19:36.878264687 +0000 UTC m=+980.525676242" Oct 07 17:19:37 crc kubenswrapper[4681]: I1007 17:19:37.877628 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665clkrns" event={"ID":"a7636f86-a942-4f89-bc80-01a3ce70c13e","Type":"ContainerStarted","Data":"2c424271057adbba1702a1e4d59232cbfbf0d719f2ab1531325ccf5493cee443"} Oct 07 17:19:37 crc kubenswrapper[4681]: I1007 17:19:37.878490 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665clkrns" Oct 07 17:19:37 crc kubenswrapper[4681]: I1007 17:19:37.906617 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665clkrns" podStartSLOduration=3.465835205 podStartE2EDuration="44.906599681s" podCreationTimestamp="2025-10-07 17:18:53 +0000 UTC" firstStartedPulling="2025-10-07 17:18:56.059004259 +0000 UTC m=+939.706415804" lastFinishedPulling="2025-10-07 17:19:37.499768735 +0000 UTC m=+981.147180280" observedRunningTime="2025-10-07 17:19:37.90226027 +0000 UTC m=+981.549671825" watchObservedRunningTime="2025-10-07 17:19:37.906599681 +0000 UTC m=+981.554011236" Oct 07 17:19:43 crc kubenswrapper[4681]: I1007 17:19:43.324662 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-c8nzf" Oct 07 17:19:43 crc kubenswrapper[4681]: I1007 17:19:43.340719 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-9qrr7" Oct 07 17:19:43 crc kubenswrapper[4681]: I1007 17:19:43.730861 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-spqlq" Oct 07 17:19:43 crc kubenswrapper[4681]: I1007 17:19:43.738399 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-ddgm8" Oct 07 17:19:43 crc kubenswrapper[4681]: I1007 17:19:43.782191 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-plkk5" Oct 07 17:19:45 crc kubenswrapper[4681]: I1007 17:19:45.390235 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665clkrns" Oct 07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.503459 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cm2kn"] Oct 07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.505131 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-cm2kn" Oct 07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.506705 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.507692 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.507945 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.508082 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-bs7bb" Oct 07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.522497 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cm2kn"] Oct 07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.563329 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2p5z8"] Oct 07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.565002 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-2p5z8" Oct 07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.576180 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.583200 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2p5z8"] Oct 07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.660171 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ae6d556-e120-4432-82c1-b809d98ec9c9-config\") pod \"dnsmasq-dns-675f4bcbfc-cm2kn\" (UID: \"7ae6d556-e120-4432-82c1-b809d98ec9c9\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cm2kn" Oct 07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.660236 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cksh6\" (UniqueName: \"kubernetes.io/projected/7ae6d556-e120-4432-82c1-b809d98ec9c9-kube-api-access-cksh6\") pod \"dnsmasq-dns-675f4bcbfc-cm2kn\" (UID: \"7ae6d556-e120-4432-82c1-b809d98ec9c9\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cm2kn" Oct 07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.761925 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlm5p\" (UniqueName: \"kubernetes.io/projected/6b21a245-8e31-4b8a-be69-f0b874735531-kube-api-access-dlm5p\") pod \"dnsmasq-dns-78dd6ddcc-2p5z8\" (UID: \"6b21a245-8e31-4b8a-be69-f0b874735531\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2p5z8" Oct 07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.761972 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ae6d556-e120-4432-82c1-b809d98ec9c9-config\") pod \"dnsmasq-dns-675f4bcbfc-cm2kn\" (UID: \"7ae6d556-e120-4432-82c1-b809d98ec9c9\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cm2kn" Oct 07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.762002 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b21a245-8e31-4b8a-be69-f0b874735531-config\") pod \"dnsmasq-dns-78dd6ddcc-2p5z8\" (UID: \"6b21a245-8e31-4b8a-be69-f0b874735531\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2p5z8" Oct 
07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.762027 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cksh6\" (UniqueName: \"kubernetes.io/projected/7ae6d556-e120-4432-82c1-b809d98ec9c9-kube-api-access-cksh6\") pod \"dnsmasq-dns-675f4bcbfc-cm2kn\" (UID: \"7ae6d556-e120-4432-82c1-b809d98ec9c9\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cm2kn" Oct 07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.762063 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b21a245-8e31-4b8a-be69-f0b874735531-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-2p5z8\" (UID: \"6b21a245-8e31-4b8a-be69-f0b874735531\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2p5z8" Oct 07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.762831 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ae6d556-e120-4432-82c1-b809d98ec9c9-config\") pod \"dnsmasq-dns-675f4bcbfc-cm2kn\" (UID: \"7ae6d556-e120-4432-82c1-b809d98ec9c9\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cm2kn" Oct 07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.802696 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cksh6\" (UniqueName: \"kubernetes.io/projected/7ae6d556-e120-4432-82c1-b809d98ec9c9-kube-api-access-cksh6\") pod \"dnsmasq-dns-675f4bcbfc-cm2kn\" (UID: \"7ae6d556-e120-4432-82c1-b809d98ec9c9\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cm2kn" Oct 07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.820577 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-cm2kn" Oct 07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.862977 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b21a245-8e31-4b8a-be69-f0b874735531-config\") pod \"dnsmasq-dns-78dd6ddcc-2p5z8\" (UID: \"6b21a245-8e31-4b8a-be69-f0b874735531\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2p5z8" Oct 07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.863312 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b21a245-8e31-4b8a-be69-f0b874735531-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-2p5z8\" (UID: \"6b21a245-8e31-4b8a-be69-f0b874735531\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2p5z8" Oct 07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.863374 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlm5p\" (UniqueName: \"kubernetes.io/projected/6b21a245-8e31-4b8a-be69-f0b874735531-kube-api-access-dlm5p\") pod \"dnsmasq-dns-78dd6ddcc-2p5z8\" (UID: \"6b21a245-8e31-4b8a-be69-f0b874735531\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2p5z8" Oct 07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.863975 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b21a245-8e31-4b8a-be69-f0b874735531-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-2p5z8\" (UID: \"6b21a245-8e31-4b8a-be69-f0b874735531\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2p5z8" Oct 07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.864068 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b21a245-8e31-4b8a-be69-f0b874735531-config\") pod \"dnsmasq-dns-78dd6ddcc-2p5z8\" (UID: 
\"6b21a245-8e31-4b8a-be69-f0b874735531\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2p5z8" Oct 07 17:20:05 crc kubenswrapper[4681]: I1007 17:20:05.901803 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlm5p\" (UniqueName: \"kubernetes.io/projected/6b21a245-8e31-4b8a-be69-f0b874735531-kube-api-access-dlm5p\") pod \"dnsmasq-dns-78dd6ddcc-2p5z8\" (UID: \"6b21a245-8e31-4b8a-be69-f0b874735531\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2p5z8" Oct 07 17:20:06 crc kubenswrapper[4681]: I1007 17:20:06.177910 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-2p5z8" Oct 07 17:20:06 crc kubenswrapper[4681]: I1007 17:20:06.401407 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cm2kn"] Oct 07 17:20:06 crc kubenswrapper[4681]: I1007 17:20:06.595304 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2p5z8"] Oct 07 17:20:07 crc kubenswrapper[4681]: I1007 17:20:07.073680 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-2p5z8" event={"ID":"6b21a245-8e31-4b8a-be69-f0b874735531","Type":"ContainerStarted","Data":"3c6757adce13418b062bca5421ed400286aded94eacba4d022b94e1897e151ae"} Oct 07 17:20:07 crc kubenswrapper[4681]: I1007 17:20:07.077056 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-cm2kn" event={"ID":"7ae6d556-e120-4432-82c1-b809d98ec9c9","Type":"ContainerStarted","Data":"dd72d298da3df8eae16b6542f7f9df71a5c8e7f550a4e3aa3196deeb633f0ff1"} Oct 07 17:20:08 crc kubenswrapper[4681]: I1007 17:20:08.768169 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cm2kn"] Oct 07 17:20:08 crc kubenswrapper[4681]: I1007 17:20:08.812548 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-g2v94"] Oct 07 17:20:08 crc kubenswrapper[4681]: I1007 17:20:08.813695 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-g2v94" Oct 07 17:20:08 crc kubenswrapper[4681]: I1007 17:20:08.831758 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-g2v94"] Oct 07 17:20:08 crc kubenswrapper[4681]: I1007 17:20:08.921596 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d94b9739-e609-4d7c-a7f2-813b90d33fdd-dns-svc\") pod \"dnsmasq-dns-666b6646f7-g2v94\" (UID: \"d94b9739-e609-4d7c-a7f2-813b90d33fdd\") " pod="openstack/dnsmasq-dns-666b6646f7-g2v94" Oct 07 17:20:08 crc kubenswrapper[4681]: I1007 17:20:08.921661 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hsrs\" (UniqueName: \"kubernetes.io/projected/d94b9739-e609-4d7c-a7f2-813b90d33fdd-kube-api-access-9hsrs\") pod \"dnsmasq-dns-666b6646f7-g2v94\" (UID: \"d94b9739-e609-4d7c-a7f2-813b90d33fdd\") " pod="openstack/dnsmasq-dns-666b6646f7-g2v94" Oct 07 17:20:08 crc kubenswrapper[4681]: I1007 17:20:08.921719 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d94b9739-e609-4d7c-a7f2-813b90d33fdd-config\") pod \"dnsmasq-dns-666b6646f7-g2v94\" (UID: \"d94b9739-e609-4d7c-a7f2-813b90d33fdd\") " pod="openstack/dnsmasq-dns-666b6646f7-g2v94" Oct 07 17:20:09 crc kubenswrapper[4681]: I1007 17:20:09.027960 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d94b9739-e609-4d7c-a7f2-813b90d33fdd-dns-svc\") pod \"dnsmasq-dns-666b6646f7-g2v94\" (UID: \"d94b9739-e609-4d7c-a7f2-813b90d33fdd\") " pod="openstack/dnsmasq-dns-666b6646f7-g2v94" Oct 07 17:20:09 crc kubenswrapper[4681]: I1007 17:20:09.028029 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hsrs\" (UniqueName: \"kubernetes.io/projected/d94b9739-e609-4d7c-a7f2-813b90d33fdd-kube-api-access-9hsrs\") pod \"dnsmasq-dns-666b6646f7-g2v94\" (UID: \"d94b9739-e609-4d7c-a7f2-813b90d33fdd\") " pod="openstack/dnsmasq-dns-666b6646f7-g2v94" Oct 07 17:20:09 crc kubenswrapper[4681]: I1007 17:20:09.028075 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d94b9739-e609-4d7c-a7f2-813b90d33fdd-config\") pod \"dnsmasq-dns-666b6646f7-g2v94\" (UID: \"d94b9739-e609-4d7c-a7f2-813b90d33fdd\") " pod="openstack/dnsmasq-dns-666b6646f7-g2v94" Oct 07 17:20:09 crc kubenswrapper[4681]: I1007 17:20:09.028914 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d94b9739-e609-4d7c-a7f2-813b90d33fdd-dns-svc\") pod \"dnsmasq-dns-666b6646f7-g2v94\" (UID: \"d94b9739-e609-4d7c-a7f2-813b90d33fdd\") " pod="openstack/dnsmasq-dns-666b6646f7-g2v94" Oct 07 17:20:09 crc kubenswrapper[4681]: I1007 17:20:09.029004 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d94b9739-e609-4d7c-a7f2-813b90d33fdd-config\") pod \"dnsmasq-dns-666b6646f7-g2v94\" (UID: \"d94b9739-e609-4d7c-a7f2-813b90d33fdd\") " pod="openstack/dnsmasq-dns-666b6646f7-g2v94" Oct 07 17:20:09 crc kubenswrapper[4681]: I1007 17:20:09.073214 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hsrs\" (UniqueName: 
\"kubernetes.io/projected/d94b9739-e609-4d7c-a7f2-813b90d33fdd-kube-api-access-9hsrs\") pod \"dnsmasq-dns-666b6646f7-g2v94\" (UID: \"d94b9739-e609-4d7c-a7f2-813b90d33fdd\") " pod="openstack/dnsmasq-dns-666b6646f7-g2v94" Oct 07 17:20:09 crc kubenswrapper[4681]: I1007 17:20:09.135568 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-g2v94" Oct 07 17:20:09 crc kubenswrapper[4681]: I1007 17:20:09.151922 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2p5z8"] Oct 07 17:20:09 crc kubenswrapper[4681]: I1007 17:20:09.209989 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9z6k8"] Oct 07 17:20:09 crc kubenswrapper[4681]: I1007 17:20:09.211303 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9z6k8" Oct 07 17:20:09 crc kubenswrapper[4681]: I1007 17:20:09.231568 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0afd9d9-c53a-450f-8aa1-57aa2996d9ea-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-9z6k8\" (UID: \"d0afd9d9-c53a-450f-8aa1-57aa2996d9ea\") " pod="openstack/dnsmasq-dns-57d769cc4f-9z6k8" Oct 07 17:20:09 crc kubenswrapper[4681]: I1007 17:20:09.231639 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpnzw\" (UniqueName: \"kubernetes.io/projected/d0afd9d9-c53a-450f-8aa1-57aa2996d9ea-kube-api-access-gpnzw\") pod \"dnsmasq-dns-57d769cc4f-9z6k8\" (UID: \"d0afd9d9-c53a-450f-8aa1-57aa2996d9ea\") " pod="openstack/dnsmasq-dns-57d769cc4f-9z6k8" Oct 07 17:20:09 crc kubenswrapper[4681]: I1007 17:20:09.231711 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0afd9d9-c53a-450f-8aa1-57aa2996d9ea-config\") pod \"dnsmasq-dns-57d769cc4f-9z6k8\" (UID: \"d0afd9d9-c53a-450f-8aa1-57aa2996d9ea\") " pod="openstack/dnsmasq-dns-57d769cc4f-9z6k8" Oct 07 17:20:09 crc kubenswrapper[4681]: I1007 17:20:09.263102 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9z6k8"] Oct 07 17:20:09 crc kubenswrapper[4681]: I1007 17:20:09.333403 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0afd9d9-c53a-450f-8aa1-57aa2996d9ea-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-9z6k8\" (UID: \"d0afd9d9-c53a-450f-8aa1-57aa2996d9ea\") " pod="openstack/dnsmasq-dns-57d769cc4f-9z6k8" Oct 07 17:20:09 crc kubenswrapper[4681]: I1007 17:20:09.333458 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpnzw\" (UniqueName: \"kubernetes.io/projected/d0afd9d9-c53a-450f-8aa1-57aa2996d9ea-kube-api-access-gpnzw\") pod \"dnsmasq-dns-57d769cc4f-9z6k8\" (UID: \"d0afd9d9-c53a-450f-8aa1-57aa2996d9ea\") " pod="openstack/dnsmasq-dns-57d769cc4f-9z6k8" Oct 07 17:20:09 crc kubenswrapper[4681]: I1007 17:20:09.333545 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0afd9d9-c53a-450f-8aa1-57aa2996d9ea-config\") pod \"dnsmasq-dns-57d769cc4f-9z6k8\" (UID: \"d0afd9d9-c53a-450f-8aa1-57aa2996d9ea\") " pod="openstack/dnsmasq-dns-57d769cc4f-9z6k8" Oct 07 17:20:09 crc kubenswrapper[4681]: I1007 17:20:09.334780 4681 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0afd9d9-c53a-450f-8aa1-57aa2996d9ea-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-9z6k8\" (UID: \"d0afd9d9-c53a-450f-8aa1-57aa2996d9ea\") " pod="openstack/dnsmasq-dns-57d769cc4f-9z6k8" Oct 07 17:20:09 crc kubenswrapper[4681]: I1007 17:20:09.336445 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0afd9d9-c53a-450f-8aa1-57aa2996d9ea-config\") pod \"dnsmasq-dns-57d769cc4f-9z6k8\" (UID: \"d0afd9d9-c53a-450f-8aa1-57aa2996d9ea\") " pod="openstack/dnsmasq-dns-57d769cc4f-9z6k8" Oct 07 17:20:09 crc kubenswrapper[4681]: I1007 17:20:09.403336 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpnzw\" (UniqueName: \"kubernetes.io/projected/d0afd9d9-c53a-450f-8aa1-57aa2996d9ea-kube-api-access-gpnzw\") pod \"dnsmasq-dns-57d769cc4f-9z6k8\" (UID: \"d0afd9d9-c53a-450f-8aa1-57aa2996d9ea\") " pod="openstack/dnsmasq-dns-57d769cc4f-9z6k8" Oct 07 17:20:09 crc kubenswrapper[4681]: I1007 17:20:09.633367 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9z6k8" Oct 07 17:20:09 crc kubenswrapper[4681]: I1007 17:20:09.898085 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-g2v94"] Oct 07 17:20:09 crc kubenswrapper[4681]: W1007 17:20:09.910993 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd94b9739_e609_4d7c_a7f2_813b90d33fdd.slice/crio-616b8c7cb91bbb460329bbf15d09221e3532f68423f9b67b395b0a061c4f57cd WatchSource:0}: Error finding container 616b8c7cb91bbb460329bbf15d09221e3532f68423f9b67b395b0a061c4f57cd: Status 404 returned error can't find the container with id 616b8c7cb91bbb460329bbf15d09221e3532f68423f9b67b395b0a061c4f57cd Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.002422 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.007698 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.011582 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.013385 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.013539 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.013706 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-m6258" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.013834 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.013987 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.014047 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.027923 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.125220 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-g2v94" event={"ID":"d94b9739-e609-4d7c-a7f2-813b90d33fdd","Type":"ContainerStarted","Data":"616b8c7cb91bbb460329bbf15d09221e3532f68423f9b67b395b0a061c4f57cd"} Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.149501 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/44a71bcd-3178-4394-8031-673c93a6981e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.149571 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/44a71bcd-3178-4394-8031-673c93a6981e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.149598 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/44a71bcd-3178-4394-8031-673c93a6981e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.149679 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.149738 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/44a71bcd-3178-4394-8031-673c93a6981e-server-conf\") pod \"rabbitmq-server-0\" (UID: 
\"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.149808 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/44a71bcd-3178-4394-8031-673c93a6981e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.149842 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6mwt\" (UniqueName: \"kubernetes.io/projected/44a71bcd-3178-4394-8031-673c93a6981e-kube-api-access-m6mwt\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.149910 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/44a71bcd-3178-4394-8031-673c93a6981e-config-data\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.149944 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/44a71bcd-3178-4394-8031-673c93a6981e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.149994 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/44a71bcd-3178-4394-8031-673c93a6981e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.150021 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/44a71bcd-3178-4394-8031-673c93a6981e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.252144 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/44a71bcd-3178-4394-8031-673c93a6981e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.252199 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/44a71bcd-3178-4394-8031-673c93a6981e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.252228 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/44a71bcd-3178-4394-8031-673c93a6981e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc 
kubenswrapper[4681]: I1007 17:20:10.252294 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.252314 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/44a71bcd-3178-4394-8031-673c93a6981e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.252369 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/44a71bcd-3178-4394-8031-673c93a6981e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.252388 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6mwt\" (UniqueName: \"kubernetes.io/projected/44a71bcd-3178-4394-8031-673c93a6981e-kube-api-access-m6mwt\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.252409 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/44a71bcd-3178-4394-8031-673c93a6981e-config-data\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.252452 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/44a71bcd-3178-4394-8031-673c93a6981e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.252468 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/44a71bcd-3178-4394-8031-673c93a6981e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.252484 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/44a71bcd-3178-4394-8031-673c93a6981e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.254766 4681 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.255151 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/44a71bcd-3178-4394-8031-673c93a6981e-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.255826 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/44a71bcd-3178-4394-8031-673c93a6981e-config-data\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.256810 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/44a71bcd-3178-4394-8031-673c93a6981e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.257819 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/44a71bcd-3178-4394-8031-673c93a6981e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.257932 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/44a71bcd-3178-4394-8031-673c93a6981e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.258750 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/44a71bcd-3178-4394-8031-673c93a6981e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.261206 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/44a71bcd-3178-4394-8031-673c93a6981e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.262402 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/44a71bcd-3178-4394-8031-673c93a6981e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.281563 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/44a71bcd-3178-4394-8031-673c93a6981e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.284045 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.287001 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6mwt\" (UniqueName: 
\"kubernetes.io/projected/44a71bcd-3178-4394-8031-673c93a6981e-kube-api-access-m6mwt\") pod \"rabbitmq-server-0\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.332329 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.333835 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9z6k8"] Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.606742 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.608653 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.611662 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.612168 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.612375 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.612412 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.612379 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.612583 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-thq44" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.612744 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.626536 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.763249 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c8a62bbf-000f-4b40-87e9-8dad6f714178-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.763399 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c8a62bbf-000f-4b40-87e9-8dad6f714178-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.763443 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c8a62bbf-000f-4b40-87e9-8dad6f714178-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.763488 4681 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c8a62bbf-000f-4b40-87e9-8dad6f714178-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.763535 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c8a62bbf-000f-4b40-87e9-8dad6f714178-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.763589 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c99dx\" (UniqueName: \"kubernetes.io/projected/c8a62bbf-000f-4b40-87e9-8dad6f714178-kube-api-access-c99dx\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.763611 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c8a62bbf-000f-4b40-87e9-8dad6f714178-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.763661 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c8a62bbf-000f-4b40-87e9-8dad6f714178-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.763686 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.763736 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c8a62bbf-000f-4b40-87e9-8dad6f714178-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.763764 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c8a62bbf-000f-4b40-87e9-8dad6f714178-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.800616 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 17:20:10 crc kubenswrapper[4681]: W1007 17:20:10.843509 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44a71bcd_3178_4394_8031_673c93a6981e.slice/crio-aef77d536618a3528727af633c6130c53ce260a15559e2acd7dc70eb20f62351 WatchSource:0}: Error 
finding container aef77d536618a3528727af633c6130c53ce260a15559e2acd7dc70eb20f62351: Status 404 returned error can't find the container with id aef77d536618a3528727af633c6130c53ce260a15559e2acd7dc70eb20f62351 Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.865641 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c8a62bbf-000f-4b40-87e9-8dad6f714178-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.865693 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c8a62bbf-000f-4b40-87e9-8dad6f714178-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.865714 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c8a62bbf-000f-4b40-87e9-8dad6f714178-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.865743 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c8a62bbf-000f-4b40-87e9-8dad6f714178-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.865766 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c99dx\" (UniqueName: \"kubernetes.io/projected/c8a62bbf-000f-4b40-87e9-8dad6f714178-kube-api-access-c99dx\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.865785 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c8a62bbf-000f-4b40-87e9-8dad6f714178-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.866364 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c8a62bbf-000f-4b40-87e9-8dad6f714178-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.866364 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c8a62bbf-000f-4b40-87e9-8dad6f714178-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.866505 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.866664 4681 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.867189 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c8a62bbf-000f-4b40-87e9-8dad6f714178-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.867270 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c8a62bbf-000f-4b40-87e9-8dad6f714178-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.876525 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c8a62bbf-000f-4b40-87e9-8dad6f714178-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.877487 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c8a62bbf-000f-4b40-87e9-8dad6f714178-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.880446 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c8a62bbf-000f-4b40-87e9-8dad6f714178-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.882378 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c8a62bbf-000f-4b40-87e9-8dad6f714178-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.890402 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c99dx\" (UniqueName: \"kubernetes.io/projected/c8a62bbf-000f-4b40-87e9-8dad6f714178-kube-api-access-c99dx\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.873009 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c8a62bbf-000f-4b40-87e9-8dad6f714178-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.899948 4681 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c8a62bbf-000f-4b40-87e9-8dad6f714178-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.900003 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c8a62bbf-000f-4b40-87e9-8dad6f714178-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.900533 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c8a62bbf-000f-4b40-87e9-8dad6f714178-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.904865 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c8a62bbf-000f-4b40-87e9-8dad6f714178-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.924924 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:10 crc kubenswrapper[4681]: I1007 17:20:10.962485 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:20:11 crc kubenswrapper[4681]: I1007 17:20:11.150057 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"44a71bcd-3178-4394-8031-673c93a6981e","Type":"ContainerStarted","Data":"aef77d536618a3528727af633c6130c53ce260a15559e2acd7dc70eb20f62351"} Oct 07 17:20:11 crc kubenswrapper[4681]: I1007 17:20:11.155003 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9z6k8" event={"ID":"d0afd9d9-c53a-450f-8aa1-57aa2996d9ea","Type":"ContainerStarted","Data":"d4bc18a318b3b5ab3352ed27be0fb62d07e4012542a676d1ccfc17e12562c526"} Oct 07 17:20:11 crc kubenswrapper[4681]: I1007 17:20:11.455114 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 17:20:11 crc kubenswrapper[4681]: I1007 17:20:11.952513 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 07 17:20:11 crc kubenswrapper[4681]: I1007 17:20:11.954365 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 07 17:20:11 crc kubenswrapper[4681]: I1007 17:20:11.966313 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 07 17:20:11 crc kubenswrapper[4681]: I1007 17:20:11.966528 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 07 17:20:11 crc kubenswrapper[4681]: I1007 17:20:11.966758 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-s49nj" Oct 07 17:20:11 crc kubenswrapper[4681]: I1007 17:20:11.966868 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 07 17:20:11 crc kubenswrapper[4681]: I1007 17:20:11.967088 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 07 17:20:11 crc kubenswrapper[4681]: I1007 17:20:11.999509 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.001200 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.049420 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7d261af7-bc67-4638-8b4c-1f7a7cb129a2-config-data-default\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.087286 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/7d261af7-bc67-4638-8b4c-1f7a7cb129a2-secrets\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.087544 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d261af7-bc67-4638-8b4c-1f7a7cb129a2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.088179 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7d261af7-bc67-4638-8b4c-1f7a7cb129a2-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.088295 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d261af7-bc67-4638-8b4c-1f7a7cb129a2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.088341 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7d261af7-bc67-4638-8b4c-1f7a7cb129a2-kolla-config\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 
07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.088460 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.088508 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrfsq\" (UniqueName: \"kubernetes.io/projected/7d261af7-bc67-4638-8b4c-1f7a7cb129a2-kube-api-access-hrfsq\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.088544 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d261af7-bc67-4638-8b4c-1f7a7cb129a2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.194220 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d261af7-bc67-4638-8b4c-1f7a7cb129a2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.194298 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7d261af7-bc67-4638-8b4c-1f7a7cb129a2-kolla-config\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.194402 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.194440 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrfsq\" (UniqueName: \"kubernetes.io/projected/7d261af7-bc67-4638-8b4c-1f7a7cb129a2-kube-api-access-hrfsq\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.194463 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d261af7-bc67-4638-8b4c-1f7a7cb129a2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.194506 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7d261af7-bc67-4638-8b4c-1f7a7cb129a2-config-data-default\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.194527 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: 
\"kubernetes.io/secret/7d261af7-bc67-4638-8b4c-1f7a7cb129a2-secrets\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.194551 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d261af7-bc67-4638-8b4c-1f7a7cb129a2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.194580 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7d261af7-bc67-4638-8b4c-1f7a7cb129a2-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.195593 4681 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.196192 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7d261af7-bc67-4638-8b4c-1f7a7cb129a2-kolla-config\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.196381 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7d261af7-bc67-4638-8b4c-1f7a7cb129a2-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.196721 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7d261af7-bc67-4638-8b4c-1f7a7cb129a2-config-data-default\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.198015 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d261af7-bc67-4638-8b4c-1f7a7cb129a2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.208040 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/7d261af7-bc67-4638-8b4c-1f7a7cb129a2-secrets\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.208237 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c8a62bbf-000f-4b40-87e9-8dad6f714178","Type":"ContainerStarted","Data":"88c2feb12155a70577c58a0b6248c830b305f10eccfbd872c73d354a9e85b472"} Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.220434 4681 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d261af7-bc67-4638-8b4c-1f7a7cb129a2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.220916 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d261af7-bc67-4638-8b4c-1f7a7cb129a2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.227766 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrfsq\" (UniqueName: \"kubernetes.io/projected/7d261af7-bc67-4638-8b4c-1f7a7cb129a2-kube-api-access-hrfsq\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.266063 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"7d261af7-bc67-4638-8b4c-1f7a7cb129a2\") " pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.284487 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.919456 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.921163 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.926171 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.926439 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-rr98r" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.926604 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.927290 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 07 17:20:12 crc kubenswrapper[4681]: I1007 17:20:12.943369 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.011639 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wdww\" (UniqueName: \"kubernetes.io/projected/61391679-2b8c-4be3-b3d7-bd2d3e667c15-kube-api-access-5wdww\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.011703 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.011729 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/61391679-2b8c-4be3-b3d7-bd2d3e667c15-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.011799 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/61391679-2b8c-4be3-b3d7-bd2d3e667c15-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.011838 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61391679-2b8c-4be3-b3d7-bd2d3e667c15-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.011869 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/61391679-2b8c-4be3-b3d7-bd2d3e667c15-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.011924 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: 
\"kubernetes.io/secret/61391679-2b8c-4be3-b3d7-bd2d3e667c15-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.011969 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/61391679-2b8c-4be3-b3d7-bd2d3e667c15-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.011994 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61391679-2b8c-4be3-b3d7-bd2d3e667c15-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.015795 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.113422 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/61391679-2b8c-4be3-b3d7-bd2d3e667c15-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.113506 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/61391679-2b8c-4be3-b3d7-bd2d3e667c15-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.113525 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61391679-2b8c-4be3-b3d7-bd2d3e667c15-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.113555 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wdww\" (UniqueName: \"kubernetes.io/projected/61391679-2b8c-4be3-b3d7-bd2d3e667c15-kube-api-access-5wdww\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.113574 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.113588 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/61391679-2b8c-4be3-b3d7-bd2d3e667c15-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.113681 4681 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/61391679-2b8c-4be3-b3d7-bd2d3e667c15-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.113711 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61391679-2b8c-4be3-b3d7-bd2d3e667c15-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.113735 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/61391679-2b8c-4be3-b3d7-bd2d3e667c15-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.114833 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/61391679-2b8c-4be3-b3d7-bd2d3e667c15-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.115146 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61391679-2b8c-4be3-b3d7-bd2d3e667c15-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.115192 4681 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.115429 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/61391679-2b8c-4be3-b3d7-bd2d3e667c15-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.115642 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/61391679-2b8c-4be3-b3d7-bd2d3e667c15-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.121028 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61391679-2b8c-4be3-b3d7-bd2d3e667c15-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.135188 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/61391679-2b8c-4be3-b3d7-bd2d3e667c15-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.141373 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wdww\" (UniqueName: \"kubernetes.io/projected/61391679-2b8c-4be3-b3d7-bd2d3e667c15-kube-api-access-5wdww\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.144710 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/61391679-2b8c-4be3-b3d7-bd2d3e667c15-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.157970 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"61391679-2b8c-4be3-b3d7-bd2d3e667c15\") " pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.251504 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.285506 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7d261af7-bc67-4638-8b4c-1f7a7cb129a2","Type":"ContainerStarted","Data":"065103f5c4fdf3517115926e9f3f32fa23e6045c71f0ac3a9a2ac41aa78b8b25"} Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.510745 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.511948 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.514407 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-22dlb" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.514654 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.516154 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.544354 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.646739 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa4b182-3cf3-4e5d-b59d-5e00004cb912-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1aa4b182-3cf3-4e5d-b59d-5e00004cb912\") " pod="openstack/memcached-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.646787 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1aa4b182-3cf3-4e5d-b59d-5e00004cb912-kolla-config\") pod \"memcached-0\" (UID: \"1aa4b182-3cf3-4e5d-b59d-5e00004cb912\") " pod="openstack/memcached-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.646828 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1aa4b182-3cf3-4e5d-b59d-5e00004cb912-config-data\") pod \"memcached-0\" (UID: \"1aa4b182-3cf3-4e5d-b59d-5e00004cb912\") " pod="openstack/memcached-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.646848 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldxzt\" (UniqueName: \"kubernetes.io/projected/1aa4b182-3cf3-4e5d-b59d-5e00004cb912-kube-api-access-ldxzt\") pod \"memcached-0\" (UID: \"1aa4b182-3cf3-4e5d-b59d-5e00004cb912\") " pod="openstack/memcached-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.646912 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1aa4b182-3cf3-4e5d-b59d-5e00004cb912-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1aa4b182-3cf3-4e5d-b59d-5e00004cb912\") " pod="openstack/memcached-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.751580 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1aa4b182-3cf3-4e5d-b59d-5e00004cb912-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1aa4b182-3cf3-4e5d-b59d-5e00004cb912\") " pod="openstack/memcached-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.751962 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa4b182-3cf3-4e5d-b59d-5e00004cb912-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1aa4b182-3cf3-4e5d-b59d-5e00004cb912\") " pod="openstack/memcached-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.751993 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/1aa4b182-3cf3-4e5d-b59d-5e00004cb912-kolla-config\") pod \"memcached-0\" (UID: \"1aa4b182-3cf3-4e5d-b59d-5e00004cb912\") " pod="openstack/memcached-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.752059 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1aa4b182-3cf3-4e5d-b59d-5e00004cb912-config-data\") pod \"memcached-0\" (UID: \"1aa4b182-3cf3-4e5d-b59d-5e00004cb912\") " pod="openstack/memcached-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.752078 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldxzt\" (UniqueName: \"kubernetes.io/projected/1aa4b182-3cf3-4e5d-b59d-5e00004cb912-kube-api-access-ldxzt\") pod \"memcached-0\" (UID: \"1aa4b182-3cf3-4e5d-b59d-5e00004cb912\") " pod="openstack/memcached-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.754090 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1aa4b182-3cf3-4e5d-b59d-5e00004cb912-config-data\") pod \"memcached-0\" (UID: \"1aa4b182-3cf3-4e5d-b59d-5e00004cb912\") " pod="openstack/memcached-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.754909 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1aa4b182-3cf3-4e5d-b59d-5e00004cb912-kolla-config\") pod \"memcached-0\" (UID: \"1aa4b182-3cf3-4e5d-b59d-5e00004cb912\") " pod="openstack/memcached-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.763730 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa4b182-3cf3-4e5d-b59d-5e00004cb912-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1aa4b182-3cf3-4e5d-b59d-5e00004cb912\") " pod="openstack/memcached-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.775556 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldxzt\" (UniqueName: \"kubernetes.io/projected/1aa4b182-3cf3-4e5d-b59d-5e00004cb912-kube-api-access-ldxzt\") pod \"memcached-0\" (UID: \"1aa4b182-3cf3-4e5d-b59d-5e00004cb912\") " pod="openstack/memcached-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.780112 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1aa4b182-3cf3-4e5d-b59d-5e00004cb912-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1aa4b182-3cf3-4e5d-b59d-5e00004cb912\") " pod="openstack/memcached-0" Oct 07 17:20:13 crc kubenswrapper[4681]: I1007 17:20:13.844178 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 07 17:20:14 crc kubenswrapper[4681]: I1007 17:20:14.113556 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 17:20:15 crc kubenswrapper[4681]: I1007 17:20:15.189962 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 07 17:20:15 crc kubenswrapper[4681]: W1007 17:20:15.201540 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1aa4b182_3cf3_4e5d_b59d_5e00004cb912.slice/crio-ce498487fc70281a71f1883220294efd1a444b7f40ddcef9791f9cb0b3c3790a WatchSource:0}: Error finding container ce498487fc70281a71f1883220294efd1a444b7f40ddcef9791f9cb0b3c3790a: Status 404 returned error can't find the container with id ce498487fc70281a71f1883220294efd1a444b7f40ddcef9791f9cb0b3c3790a Oct 07 17:20:15 crc kubenswrapper[4681]: I1007 17:20:15.438497 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"61391679-2b8c-4be3-b3d7-bd2d3e667c15","Type":"ContainerStarted","Data":"9a15a7b5d4207845386a33fe67e2ebe221f312ec722f666ef7d7e8bb99ab20e2"} Oct 07 17:20:15 crc kubenswrapper[4681]: I1007 17:20:15.476699 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1aa4b182-3cf3-4e5d-b59d-5e00004cb912","Type":"ContainerStarted","Data":"ce498487fc70281a71f1883220294efd1a444b7f40ddcef9791f9cb0b3c3790a"} Oct 07 17:20:15 crc kubenswrapper[4681]: I1007 17:20:15.532127 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 17:20:15 crc kubenswrapper[4681]: I1007 17:20:15.533955 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 17:20:15 crc kubenswrapper[4681]: I1007 17:20:15.537267 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-7jtff" Oct 07 17:20:15 crc kubenswrapper[4681]: I1007 17:20:15.547314 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 17:20:15 crc kubenswrapper[4681]: I1007 17:20:15.666772 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl6tt\" (UniqueName: \"kubernetes.io/projected/88c0d090-0803-4fff-a9a3-9b41529b8a23-kube-api-access-vl6tt\") pod \"kube-state-metrics-0\" (UID: \"88c0d090-0803-4fff-a9a3-9b41529b8a23\") " pod="openstack/kube-state-metrics-0" Oct 07 17:20:15 crc kubenswrapper[4681]: I1007 17:20:15.768461 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl6tt\" (UniqueName: \"kubernetes.io/projected/88c0d090-0803-4fff-a9a3-9b41529b8a23-kube-api-access-vl6tt\") pod \"kube-state-metrics-0\" (UID: \"88c0d090-0803-4fff-a9a3-9b41529b8a23\") " pod="openstack/kube-state-metrics-0" Oct 07 17:20:15 crc kubenswrapper[4681]: I1007 17:20:15.812930 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl6tt\" (UniqueName: \"kubernetes.io/projected/88c0d090-0803-4fff-a9a3-9b41529b8a23-kube-api-access-vl6tt\") pod \"kube-state-metrics-0\" (UID: \"88c0d090-0803-4fff-a9a3-9b41529b8a23\") " pod="openstack/kube-state-metrics-0" Oct 07 17:20:15 crc kubenswrapper[4681]: I1007 17:20:15.985666 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 17:20:16 crc kubenswrapper[4681]: I1007 17:20:16.532812 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 17:20:16 crc kubenswrapper[4681]: W1007 17:20:16.612448 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88c0d090_0803_4fff_a9a3_9b41529b8a23.slice/crio-14706f92fb10e846b1e3989b0f6e54ec10fd237202a3b8630730084e83732610 WatchSource:0}: Error finding container 14706f92fb10e846b1e3989b0f6e54ec10fd237202a3b8630730084e83732610: Status 404 returned error can't find the container with id 14706f92fb10e846b1e3989b0f6e54ec10fd237202a3b8630730084e83732610 Oct 07 17:20:17 crc kubenswrapper[4681]: I1007 17:20:17.517814 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"88c0d090-0803-4fff-a9a3-9b41529b8a23","Type":"ContainerStarted","Data":"14706f92fb10e846b1e3989b0f6e54ec10fd237202a3b8630730084e83732610"} Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.485953 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-xhwkc"] Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.487289 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xhwkc" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.494384 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.501765 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.502078 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-rfps4" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.529154 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xhwkc"] Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.536083 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8be45f14-7feb-40fa-a0a8-919c6d8cd052-var-run-ovn\") pod \"ovn-controller-xhwkc\" (UID: \"8be45f14-7feb-40fa-a0a8-919c6d8cd052\") " pod="openstack/ovn-controller-xhwkc" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.536126 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be45f14-7feb-40fa-a0a8-919c6d8cd052-combined-ca-bundle\") pod \"ovn-controller-xhwkc\" (UID: \"8be45f14-7feb-40fa-a0a8-919c6d8cd052\") " pod="openstack/ovn-controller-xhwkc" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.536288 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8be45f14-7feb-40fa-a0a8-919c6d8cd052-var-run\") pod \"ovn-controller-xhwkc\" (UID: \"8be45f14-7feb-40fa-a0a8-919c6d8cd052\") " pod="openstack/ovn-controller-xhwkc" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.536309 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8be45f14-7feb-40fa-a0a8-919c6d8cd052-var-log-ovn\") pod \"ovn-controller-xhwkc\" (UID: 
\"8be45f14-7feb-40fa-a0a8-919c6d8cd052\") " pod="openstack/ovn-controller-xhwkc" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.536494 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8be45f14-7feb-40fa-a0a8-919c6d8cd052-scripts\") pod \"ovn-controller-xhwkc\" (UID: \"8be45f14-7feb-40fa-a0a8-919c6d8cd052\") " pod="openstack/ovn-controller-xhwkc" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.536549 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77h67\" (UniqueName: \"kubernetes.io/projected/8be45f14-7feb-40fa-a0a8-919c6d8cd052-kube-api-access-77h67\") pod \"ovn-controller-xhwkc\" (UID: \"8be45f14-7feb-40fa-a0a8-919c6d8cd052\") " pod="openstack/ovn-controller-xhwkc" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.536582 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8be45f14-7feb-40fa-a0a8-919c6d8cd052-ovn-controller-tls-certs\") pod \"ovn-controller-xhwkc\" (UID: \"8be45f14-7feb-40fa-a0a8-919c6d8cd052\") " pod="openstack/ovn-controller-xhwkc" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.546739 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-6tf88"] Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.550221 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-6tf88" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.560026 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-6tf88"] Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.644399 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/6a172508-6850-4bf5-8e7f-6c6674c4a1ee-var-log\") pod \"ovn-controller-ovs-6tf88\" (UID: \"6a172508-6850-4bf5-8e7f-6c6674c4a1ee\") " pod="openstack/ovn-controller-ovs-6tf88" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.644512 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8be45f14-7feb-40fa-a0a8-919c6d8cd052-var-run\") pod \"ovn-controller-xhwkc\" (UID: \"8be45f14-7feb-40fa-a0a8-919c6d8cd052\") " pod="openstack/ovn-controller-xhwkc" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.644565 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8be45f14-7feb-40fa-a0a8-919c6d8cd052-var-log-ovn\") pod \"ovn-controller-xhwkc\" (UID: \"8be45f14-7feb-40fa-a0a8-919c6d8cd052\") " pod="openstack/ovn-controller-xhwkc" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.644702 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4fbb\" (UniqueName: \"kubernetes.io/projected/6a172508-6850-4bf5-8e7f-6c6674c4a1ee-kube-api-access-c4fbb\") pod \"ovn-controller-ovs-6tf88\" (UID: \"6a172508-6850-4bf5-8e7f-6c6674c4a1ee\") " pod="openstack/ovn-controller-ovs-6tf88" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.644844 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8be45f14-7feb-40fa-a0a8-919c6d8cd052-scripts\") pod 
\"ovn-controller-xhwkc\" (UID: \"8be45f14-7feb-40fa-a0a8-919c6d8cd052\") " pod="openstack/ovn-controller-xhwkc" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.644868 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/6a172508-6850-4bf5-8e7f-6c6674c4a1ee-var-lib\") pod \"ovn-controller-ovs-6tf88\" (UID: \"6a172508-6850-4bf5-8e7f-6c6674c4a1ee\") " pod="openstack/ovn-controller-ovs-6tf88" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.644990 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77h67\" (UniqueName: \"kubernetes.io/projected/8be45f14-7feb-40fa-a0a8-919c6d8cd052-kube-api-access-77h67\") pod \"ovn-controller-xhwkc\" (UID: \"8be45f14-7feb-40fa-a0a8-919c6d8cd052\") " pod="openstack/ovn-controller-xhwkc" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.645046 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8be45f14-7feb-40fa-a0a8-919c6d8cd052-ovn-controller-tls-certs\") pod \"ovn-controller-xhwkc\" (UID: \"8be45f14-7feb-40fa-a0a8-919c6d8cd052\") " pod="openstack/ovn-controller-xhwkc" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.645187 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a172508-6850-4bf5-8e7f-6c6674c4a1ee-scripts\") pod \"ovn-controller-ovs-6tf88\" (UID: \"6a172508-6850-4bf5-8e7f-6c6674c4a1ee\") " pod="openstack/ovn-controller-ovs-6tf88" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.645221 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8be45f14-7feb-40fa-a0a8-919c6d8cd052-var-run-ovn\") pod \"ovn-controller-xhwkc\" (UID: \"8be45f14-7feb-40fa-a0a8-919c6d8cd052\") " pod="openstack/ovn-controller-xhwkc" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.645275 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be45f14-7feb-40fa-a0a8-919c6d8cd052-combined-ca-bundle\") pod \"ovn-controller-xhwkc\" (UID: \"8be45f14-7feb-40fa-a0a8-919c6d8cd052\") " pod="openstack/ovn-controller-xhwkc" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.645302 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6a172508-6850-4bf5-8e7f-6c6674c4a1ee-var-run\") pod \"ovn-controller-ovs-6tf88\" (UID: \"6a172508-6850-4bf5-8e7f-6c6674c4a1ee\") " pod="openstack/ovn-controller-ovs-6tf88" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.645327 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/6a172508-6850-4bf5-8e7f-6c6674c4a1ee-etc-ovs\") pod \"ovn-controller-ovs-6tf88\" (UID: \"6a172508-6850-4bf5-8e7f-6c6674c4a1ee\") " pod="openstack/ovn-controller-ovs-6tf88" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.646296 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8be45f14-7feb-40fa-a0a8-919c6d8cd052-var-run\") pod \"ovn-controller-xhwkc\" (UID: \"8be45f14-7feb-40fa-a0a8-919c6d8cd052\") " pod="openstack/ovn-controller-xhwkc" Oct 07 17:20:18 crc 
kubenswrapper[4681]: I1007 17:20:18.646692 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8be45f14-7feb-40fa-a0a8-919c6d8cd052-var-log-ovn\") pod \"ovn-controller-xhwkc\" (UID: \"8be45f14-7feb-40fa-a0a8-919c6d8cd052\") " pod="openstack/ovn-controller-xhwkc" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.648116 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8be45f14-7feb-40fa-a0a8-919c6d8cd052-var-run-ovn\") pod \"ovn-controller-xhwkc\" (UID: \"8be45f14-7feb-40fa-a0a8-919c6d8cd052\") " pod="openstack/ovn-controller-xhwkc" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.658415 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be45f14-7feb-40fa-a0a8-919c6d8cd052-combined-ca-bundle\") pod \"ovn-controller-xhwkc\" (UID: \"8be45f14-7feb-40fa-a0a8-919c6d8cd052\") " pod="openstack/ovn-controller-xhwkc" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.662690 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8be45f14-7feb-40fa-a0a8-919c6d8cd052-scripts\") pod \"ovn-controller-xhwkc\" (UID: \"8be45f14-7feb-40fa-a0a8-919c6d8cd052\") " pod="openstack/ovn-controller-xhwkc" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.667716 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77h67\" (UniqueName: \"kubernetes.io/projected/8be45f14-7feb-40fa-a0a8-919c6d8cd052-kube-api-access-77h67\") pod \"ovn-controller-xhwkc\" (UID: \"8be45f14-7feb-40fa-a0a8-919c6d8cd052\") " pod="openstack/ovn-controller-xhwkc" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.670456 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8be45f14-7feb-40fa-a0a8-919c6d8cd052-ovn-controller-tls-certs\") pod \"ovn-controller-xhwkc\" (UID: \"8be45f14-7feb-40fa-a0a8-919c6d8cd052\") " pod="openstack/ovn-controller-xhwkc" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.746734 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4fbb\" (UniqueName: \"kubernetes.io/projected/6a172508-6850-4bf5-8e7f-6c6674c4a1ee-kube-api-access-c4fbb\") pod \"ovn-controller-ovs-6tf88\" (UID: \"6a172508-6850-4bf5-8e7f-6c6674c4a1ee\") " pod="openstack/ovn-controller-ovs-6tf88" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.746833 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/6a172508-6850-4bf5-8e7f-6c6674c4a1ee-var-lib\") pod \"ovn-controller-ovs-6tf88\" (UID: \"6a172508-6850-4bf5-8e7f-6c6674c4a1ee\") " pod="openstack/ovn-controller-ovs-6tf88" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.746912 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a172508-6850-4bf5-8e7f-6c6674c4a1ee-scripts\") pod \"ovn-controller-ovs-6tf88\" (UID: \"6a172508-6850-4bf5-8e7f-6c6674c4a1ee\") " pod="openstack/ovn-controller-ovs-6tf88" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.746940 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6a172508-6850-4bf5-8e7f-6c6674c4a1ee-var-run\") 
pod \"ovn-controller-ovs-6tf88\" (UID: \"6a172508-6850-4bf5-8e7f-6c6674c4a1ee\") " pod="openstack/ovn-controller-ovs-6tf88" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.746957 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/6a172508-6850-4bf5-8e7f-6c6674c4a1ee-etc-ovs\") pod \"ovn-controller-ovs-6tf88\" (UID: \"6a172508-6850-4bf5-8e7f-6c6674c4a1ee\") " pod="openstack/ovn-controller-ovs-6tf88" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.746975 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/6a172508-6850-4bf5-8e7f-6c6674c4a1ee-var-log\") pod \"ovn-controller-ovs-6tf88\" (UID: \"6a172508-6850-4bf5-8e7f-6c6674c4a1ee\") " pod="openstack/ovn-controller-ovs-6tf88" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.747272 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/6a172508-6850-4bf5-8e7f-6c6674c4a1ee-var-log\") pod \"ovn-controller-ovs-6tf88\" (UID: \"6a172508-6850-4bf5-8e7f-6c6674c4a1ee\") " pod="openstack/ovn-controller-ovs-6tf88" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.747425 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/6a172508-6850-4bf5-8e7f-6c6674c4a1ee-var-lib\") pod \"ovn-controller-ovs-6tf88\" (UID: \"6a172508-6850-4bf5-8e7f-6c6674c4a1ee\") " pod="openstack/ovn-controller-ovs-6tf88" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.748767 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6a172508-6850-4bf5-8e7f-6c6674c4a1ee-var-run\") pod \"ovn-controller-ovs-6tf88\" (UID: \"6a172508-6850-4bf5-8e7f-6c6674c4a1ee\") " pod="openstack/ovn-controller-ovs-6tf88" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.750436 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a172508-6850-4bf5-8e7f-6c6674c4a1ee-scripts\") pod \"ovn-controller-ovs-6tf88\" (UID: \"6a172508-6850-4bf5-8e7f-6c6674c4a1ee\") " pod="openstack/ovn-controller-ovs-6tf88" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.756588 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/6a172508-6850-4bf5-8e7f-6c6674c4a1ee-etc-ovs\") pod \"ovn-controller-ovs-6tf88\" (UID: \"6a172508-6850-4bf5-8e7f-6c6674c4a1ee\") " pod="openstack/ovn-controller-ovs-6tf88" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.763311 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4fbb\" (UniqueName: \"kubernetes.io/projected/6a172508-6850-4bf5-8e7f-6c6674c4a1ee-kube-api-access-c4fbb\") pod \"ovn-controller-ovs-6tf88\" (UID: \"6a172508-6850-4bf5-8e7f-6c6674c4a1ee\") " pod="openstack/ovn-controller-ovs-6tf88" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.815601 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xhwkc" Oct 07 17:20:18 crc kubenswrapper[4681]: I1007 17:20:18.903261 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-6tf88" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.589065 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.597034 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.600348 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-956zl" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.600506 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.600658 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.600760 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.600853 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.618628 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.687866 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vbhr\" (UniqueName: \"kubernetes.io/projected/9de0f04a-f2ed-48ee-a873-8a02b70fb146-kube-api-access-8vbhr\") pod \"ovsdbserver-nb-0\" (UID: \"9de0f04a-f2ed-48ee-a873-8a02b70fb146\") " pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.687951 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9de0f04a-f2ed-48ee-a873-8a02b70fb146-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9de0f04a-f2ed-48ee-a873-8a02b70fb146\") " pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.687982 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9de0f04a-f2ed-48ee-a873-8a02b70fb146-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9de0f04a-f2ed-48ee-a873-8a02b70fb146\") " pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.688054 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9de0f04a-f2ed-48ee-a873-8a02b70fb146-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9de0f04a-f2ed-48ee-a873-8a02b70fb146\") " pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.688140 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9de0f04a-f2ed-48ee-a873-8a02b70fb146-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9de0f04a-f2ed-48ee-a873-8a02b70fb146\") " pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.688216 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9de0f04a-f2ed-48ee-a873-8a02b70fb146-config\") pod \"ovsdbserver-nb-0\" (UID: \"9de0f04a-f2ed-48ee-a873-8a02b70fb146\") " pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.688244 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9de0f04a-f2ed-48ee-a873-8a02b70fb146\") " pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.688396 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9de0f04a-f2ed-48ee-a873-8a02b70fb146-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9de0f04a-f2ed-48ee-a873-8a02b70fb146\") " pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.789686 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9de0f04a-f2ed-48ee-a873-8a02b70fb146-config\") pod \"ovsdbserver-nb-0\" (UID: \"9de0f04a-f2ed-48ee-a873-8a02b70fb146\") " pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.789772 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9de0f04a-f2ed-48ee-a873-8a02b70fb146\") " pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.789836 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9de0f04a-f2ed-48ee-a873-8a02b70fb146-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9de0f04a-f2ed-48ee-a873-8a02b70fb146\") " pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.789947 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vbhr\" (UniqueName: \"kubernetes.io/projected/9de0f04a-f2ed-48ee-a873-8a02b70fb146-kube-api-access-8vbhr\") pod \"ovsdbserver-nb-0\" (UID: \"9de0f04a-f2ed-48ee-a873-8a02b70fb146\") " pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.789987 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9de0f04a-f2ed-48ee-a873-8a02b70fb146-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9de0f04a-f2ed-48ee-a873-8a02b70fb146\") " pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.790018 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9de0f04a-f2ed-48ee-a873-8a02b70fb146-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9de0f04a-f2ed-48ee-a873-8a02b70fb146\") " pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.790049 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9de0f04a-f2ed-48ee-a873-8a02b70fb146-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9de0f04a-f2ed-48ee-a873-8a02b70fb146\") " pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.790094 4681 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9de0f04a-f2ed-48ee-a873-8a02b70fb146-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9de0f04a-f2ed-48ee-a873-8a02b70fb146\") " pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.790531 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9de0f04a-f2ed-48ee-a873-8a02b70fb146-config\") pod \"ovsdbserver-nb-0\" (UID: \"9de0f04a-f2ed-48ee-a873-8a02b70fb146\") " pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.790866 4681 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9de0f04a-f2ed-48ee-a873-8a02b70fb146\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.791282 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9de0f04a-f2ed-48ee-a873-8a02b70fb146-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9de0f04a-f2ed-48ee-a873-8a02b70fb146\") " pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.792264 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9de0f04a-f2ed-48ee-a873-8a02b70fb146-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9de0f04a-f2ed-48ee-a873-8a02b70fb146\") " pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.805131 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9de0f04a-f2ed-48ee-a873-8a02b70fb146-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9de0f04a-f2ed-48ee-a873-8a02b70fb146\") " pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.809213 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vbhr\" (UniqueName: \"kubernetes.io/projected/9de0f04a-f2ed-48ee-a873-8a02b70fb146-kube-api-access-8vbhr\") pod \"ovsdbserver-nb-0\" (UID: \"9de0f04a-f2ed-48ee-a873-8a02b70fb146\") " pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.825353 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9de0f04a-f2ed-48ee-a873-8a02b70fb146-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9de0f04a-f2ed-48ee-a873-8a02b70fb146\") " pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.836064 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9de0f04a-f2ed-48ee-a873-8a02b70fb146\") " pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:20 crc kubenswrapper[4681]: I1007 17:20:20.841607 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9de0f04a-f2ed-48ee-a873-8a02b70fb146-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9de0f04a-f2ed-48ee-a873-8a02b70fb146\") " pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:20 crc 
kubenswrapper[4681]: I1007 17:20:20.920815 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.329209 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.330450 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.334925 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.334944 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.335112 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.335200 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-rjmmw"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.335281 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.425300 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5d2debf-c5bb-47fa-9d33-69c2f549a3e0-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.425367 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5d2debf-c5bb-47fa-9d33-69c2f549a3e0-config\") pod \"ovsdbserver-sb-0\" (UID: \"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.425417 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpdbw\" (UniqueName: \"kubernetes.io/projected/d5d2debf-c5bb-47fa-9d33-69c2f549a3e0-kube-api-access-rpdbw\") pod \"ovsdbserver-sb-0\" (UID: \"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.425444 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5d2debf-c5bb-47fa-9d33-69c2f549a3e0-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.425483 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5d2debf-c5bb-47fa-9d33-69c2f549a3e0-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.425527 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.425552 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5d2debf-c5bb-47fa-9d33-69c2f549a3e0-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.425574 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5d2debf-c5bb-47fa-9d33-69c2f549a3e0-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.527175 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5d2debf-c5bb-47fa-9d33-69c2f549a3e0-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.528428 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5d2debf-c5bb-47fa-9d33-69c2f549a3e0-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.528638 4681 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.547198 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.547294 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5d2debf-c5bb-47fa-9d33-69c2f549a3e0-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.547314 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5d2debf-c5bb-47fa-9d33-69c2f549a3e0-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.547420 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5d2debf-c5bb-47fa-9d33-69c2f549a3e0-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.547471 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5d2debf-c5bb-47fa-9d33-69c2f549a3e0-config\") pod \"ovsdbserver-sb-0\" (UID: \"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.547538 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpdbw\" (UniqueName: \"kubernetes.io/projected/d5d2debf-c5bb-47fa-9d33-69c2f549a3e0-kube-api-access-rpdbw\") pod \"ovsdbserver-sb-0\" (UID: \"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.547568 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5d2debf-c5bb-47fa-9d33-69c2f549a3e0-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.548082 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5d2debf-c5bb-47fa-9d33-69c2f549a3e0-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.549276 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5d2debf-c5bb-47fa-9d33-69c2f549a3e0-config\") pod \"ovsdbserver-sb-0\" (UID: \"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.555746 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5d2debf-c5bb-47fa-9d33-69c2f549a3e0-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.560340 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5d2debf-c5bb-47fa-9d33-69c2f549a3e0-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.575037 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5d2debf-c5bb-47fa-9d33-69c2f549a3e0-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.582136 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.599590 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpdbw\" (UniqueName: \"kubernetes.io/projected/d5d2debf-c5bb-47fa-9d33-69c2f549a3e0-kube-api-access-rpdbw\") pod \"ovsdbserver-sb-0\" (UID: \"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0\") " pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:22 crc kubenswrapper[4681]: I1007 17:20:22.668253 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:41 crc kubenswrapper[4681]: E1007 17:20:41.290775 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified"
Oct 07 17:20:41 crc kubenswrapper[4681]: E1007 17:20:41.291627 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m6mwt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(44a71bcd-3178-4394-8031-673c93a6981e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Oct 07 17:20:41 crc kubenswrapper[4681]: E1007 17:20:41.292809 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="44a71bcd-3178-4394-8031-673c93a6981e"
Oct 07 17:20:41 crc kubenswrapper[4681]: E1007 17:20:41.718643 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="44a71bcd-3178-4394-8031-673c93a6981e"
Oct 07 17:20:41 crc kubenswrapper[4681]: E1007 17:20:41.912182 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified"
Oct 07 17:20:41 crc kubenswrapper[4681]: E1007 17:20:41.912422 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- /usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n665h5d7h569hdfh68fh648hf6h68h557h6fh665hd6h6ch54bh55chc9hbbh4h585h68dh566h5d4h694hc9h64bh5dbhb7h59ch5dbh677h95h78q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ldxzt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(1aa4b182-3cf3-4e5d-b59d-5e00004cb912): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Oct 07 17:20:41 crc kubenswrapper[4681]: E1007 17:20:41.913601 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="1aa4b182-3cf3-4e5d-b59d-5e00004cb912"
Oct 07 17:20:41 crc kubenswrapper[4681]: E1007 17:20:41.926602 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified"
Oct 07 17:20:41 crc kubenswrapper[4681]: E1007 17:20:41.926855 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c99dx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(c8a62bbf-000f-4b40-87e9-8dad6f714178): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Oct 07 17:20:41 crc kubenswrapper[4681]: E1007 17:20:41.928109 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c8a62bbf-000f-4b40-87e9-8dad6f714178"
Oct 07 17:20:42 crc kubenswrapper[4681]: I1007 17:20:42.195497 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 17:20:42 crc kubenswrapper[4681]: I1007 17:20:42.195556 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 17:20:42 crc kubenswrapper[4681]: E1007 17:20:42.725080 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="c8a62bbf-000f-4b40-87e9-8dad6f714178"
Oct 07 17:20:42 crc kubenswrapper[4681]: E1007 17:20:42.725083 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="1aa4b182-3cf3-4e5d-b59d-5e00004cb912"
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="1aa4b182-3cf3-4e5d-b59d-5e00004cb912" Oct 07 17:20:47 crc kubenswrapper[4681]: E1007 17:20:47.633587 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Oct 07 17:20:47 crc kubenswrapper[4681]: E1007 17:20:47.634525 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:DB_ROOT_PASSWORD,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:DbRootPassword,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:secrets,ReadOnly:true,MountPath:/var/lib/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5wdww,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(61391679-2b8c-4be3-b3d7-bd2d3e667c15): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 17:20:47 crc kubenswrapper[4681]: E1007 17:20:47.635865 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context 
canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="61391679-2b8c-4be3-b3d7-bd2d3e667c15" Oct 07 17:20:48 crc kubenswrapper[4681]: E1007 17:20:48.340054 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 07 17:20:48 crc kubenswrapper[4681]: E1007 17:20:48.340566 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gpnzw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-9z6k8_openstack(d0afd9d9-c53a-450f-8aa1-57aa2996d9ea): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 17:20:48 crc kubenswrapper[4681]: E1007 17:20:48.341810 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-9z6k8" podUID="d0afd9d9-c53a-450f-8aa1-57aa2996d9ea" Oct 07 17:20:48 crc kubenswrapper[4681]: E1007 17:20:48.352088 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 07 17:20:48 crc kubenswrapper[4681]: E1007 17:20:48.352303 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9hsrs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-g2v94_openstack(d94b9739-e609-4d7c-a7f2-813b90d33fdd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 17:20:48 crc kubenswrapper[4681]: E1007 17:20:48.353492 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-g2v94" podUID="d94b9739-e609-4d7c-a7f2-813b90d33fdd" Oct 07 17:20:48 crc kubenswrapper[4681]: E1007 17:20:48.640905 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 07 17:20:48 crc kubenswrapper[4681]: E1007 17:20:48.641097 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
Oct 07 17:20:48 crc kubenswrapper[4681]: E1007 17:20:48.642286 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-cm2kn" podUID="7ae6d556-e120-4432-82c1-b809d98ec9c9"
Oct 07 17:20:48 crc kubenswrapper[4681]: E1007 17:20:48.670445 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Oct 07 17:20:48 crc kubenswrapper[4681]: E1007 17:20:48.670688 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dlm5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-2p5z8_openstack(6b21a245-8e31-4b8a-be69-f0b874735531): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Oct 07 17:20:48 crc kubenswrapper[4681]: E1007 17:20:48.671869 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-2p5z8" podUID="6b21a245-8e31-4b8a-be69-f0b874735531"
Oct 07 17:20:48 crc kubenswrapper[4681]: E1007 17:20:48.814330 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-9z6k8" podUID="d0afd9d9-c53a-450f-8aa1-57aa2996d9ea"
Oct 07 17:20:48 crc kubenswrapper[4681]: E1007 17:20:48.814405 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-g2v94" podUID="d94b9739-e609-4d7c-a7f2-813b90d33fdd"
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.338656 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-2p5z8"
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.346437 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-cm2kn"
Oct 07 17:20:49 crc kubenswrapper[4681]: E1007 17:20:49.379065 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0"
Oct 07 17:20:49 crc kubenswrapper[4681]: E1007 17:20:49.379120 4681 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0"
Oct 07 17:20:49 crc kubenswrapper[4681]: E1007 17:20:49.380010 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vl6tt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(88c0d090-0803-4fff-a9a3-9b41529b8a23): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled" logger="UnhandledError"
Oct 07 17:20:49 crc kubenswrapper[4681]: E1007 17:20:49.381195 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="88c0d090-0803-4fff-a9a3-9b41529b8a23"
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.477435 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.496388 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b21a245-8e31-4b8a-be69-f0b874735531-dns-svc\") pod \"6b21a245-8e31-4b8a-be69-f0b874735531\" (UID: \"6b21a245-8e31-4b8a-be69-f0b874735531\") "
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.496457 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cksh6\" (UniqueName: \"kubernetes.io/projected/7ae6d556-e120-4432-82c1-b809d98ec9c9-kube-api-access-cksh6\") pod \"7ae6d556-e120-4432-82c1-b809d98ec9c9\" (UID: \"7ae6d556-e120-4432-82c1-b809d98ec9c9\") "
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.496528 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlm5p\" (UniqueName: \"kubernetes.io/projected/6b21a245-8e31-4b8a-be69-f0b874735531-kube-api-access-dlm5p\") pod \"6b21a245-8e31-4b8a-be69-f0b874735531\" (UID: \"6b21a245-8e31-4b8a-be69-f0b874735531\") "
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.496579 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b21a245-8e31-4b8a-be69-f0b874735531-config\") pod \"6b21a245-8e31-4b8a-be69-f0b874735531\" (UID: \"6b21a245-8e31-4b8a-be69-f0b874735531\") "
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.496631 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ae6d556-e120-4432-82c1-b809d98ec9c9-config\") pod \"7ae6d556-e120-4432-82c1-b809d98ec9c9\" (UID: \"7ae6d556-e120-4432-82c1-b809d98ec9c9\") "
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.498088 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b21a245-8e31-4b8a-be69-f0b874735531-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6b21a245-8e31-4b8a-be69-f0b874735531" (UID: "6b21a245-8e31-4b8a-be69-f0b874735531"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.499695 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b21a245-8e31-4b8a-be69-f0b874735531-config" (OuterVolumeSpecName: "config") pod "6b21a245-8e31-4b8a-be69-f0b874735531" (UID: "6b21a245-8e31-4b8a-be69-f0b874735531"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.501262 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ae6d556-e120-4432-82c1-b809d98ec9c9-config" (OuterVolumeSpecName: "config") pod "7ae6d556-e120-4432-82c1-b809d98ec9c9" (UID: "7ae6d556-e120-4432-82c1-b809d98ec9c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.509806 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b21a245-8e31-4b8a-be69-f0b874735531-kube-api-access-dlm5p" (OuterVolumeSpecName: "kube-api-access-dlm5p") pod "6b21a245-8e31-4b8a-be69-f0b874735531" (UID: "6b21a245-8e31-4b8a-be69-f0b874735531"). InnerVolumeSpecName "kube-api-access-dlm5p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.509855 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ae6d556-e120-4432-82c1-b809d98ec9c9-kube-api-access-cksh6" (OuterVolumeSpecName: "kube-api-access-cksh6") pod "7ae6d556-e120-4432-82c1-b809d98ec9c9" (UID: "7ae6d556-e120-4432-82c1-b809d98ec9c9"). InnerVolumeSpecName "kube-api-access-cksh6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.514361 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xhwkc"]
Oct 07 17:20:49 crc kubenswrapper[4681]: W1007 17:20:49.525592 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5d2debf_c5bb_47fa_9d33_69c2f549a3e0.slice/crio-171c5561239672d63d96c3dca0b47a2173b728e258d5aa3caa32792ca96dc35d WatchSource:0}: Error finding container 171c5561239672d63d96c3dca0b47a2173b728e258d5aa3caa32792ca96dc35d: Status 404 returned error can't find the container with id 171c5561239672d63d96c3dca0b47a2173b728e258d5aa3caa32792ca96dc35d
Oct 07 17:20:49 crc kubenswrapper[4681]: W1007 17:20:49.536716 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8be45f14_7feb_40fa_a0a8_919c6d8cd052.slice/crio-9abf2851ec839ad7b6af2db6ec6880b5e9087536597d754a13263a063d4cebd5 WatchSource:0}: Error finding container 9abf2851ec839ad7b6af2db6ec6880b5e9087536597d754a13263a063d4cebd5: Status 404 returned error can't find the container with id 9abf2851ec839ad7b6af2db6ec6880b5e9087536597d754a13263a063d4cebd5
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.571112 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-6tf88"]
Oct 07 17:20:49 crc kubenswrapper[4681]: W1007 17:20:49.574484 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a172508_6850_4bf5_8e7f_6c6674c4a1ee.slice/crio-d668e1bccaac8c0c9215ef494e0c9178df2c54dee891dfd01effd30dbf3ab23b WatchSource:0}: Error finding container d668e1bccaac8c0c9215ef494e0c9178df2c54dee891dfd01effd30dbf3ab23b: Status 404 returned error can't find the container with id d668e1bccaac8c0c9215ef494e0c9178df2c54dee891dfd01effd30dbf3ab23b
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.601874 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cksh6\" (UniqueName: \"kubernetes.io/projected/7ae6d556-e120-4432-82c1-b809d98ec9c9-kube-api-access-cksh6\") on node \"crc\" DevicePath \"\""
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.601927 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlm5p\" (UniqueName: \"kubernetes.io/projected/6b21a245-8e31-4b8a-be69-f0b874735531-kube-api-access-dlm5p\") on node \"crc\" DevicePath \"\""
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.601943 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b21a245-8e31-4b8a-be69-f0b874735531-config\") on node \"crc\" DevicePath \"\""
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.601966 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ae6d556-e120-4432-82c1-b809d98ec9c9-config\") on node \"crc\" DevicePath \"\""
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.602044 4681 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b21a245-8e31-4b8a-be69-f0b874735531-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.663642 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 07 17:20:49 crc kubenswrapper[4681]: W1007 17:20:49.667311 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9de0f04a_f2ed_48ee_a873_8a02b70fb146.slice/crio-2a400b54ec8f9b85db4727bba0be8244035296a86df5337cc8b929aa73ff73bc WatchSource:0}: Error finding container 2a400b54ec8f9b85db4727bba0be8244035296a86df5337cc8b929aa73ff73bc: Status 404 returned error can't find the container with id 2a400b54ec8f9b85db4727bba0be8244035296a86df5337cc8b929aa73ff73bc
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.818073 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"61391679-2b8c-4be3-b3d7-bd2d3e667c15","Type":"ContainerStarted","Data":"b1b64a2e2c2b7d29065cb2f1733d6bc22dfcf3ecad2a46a65d0e2fd7e22b80ab"}
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.819037 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xhwkc" event={"ID":"8be45f14-7feb-40fa-a0a8-919c6d8cd052","Type":"ContainerStarted","Data":"9abf2851ec839ad7b6af2db6ec6880b5e9087536597d754a13263a063d4cebd5"}
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.819887 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-6tf88" event={"ID":"6a172508-6850-4bf5-8e7f-6c6674c4a1ee","Type":"ContainerStarted","Data":"d668e1bccaac8c0c9215ef494e0c9178df2c54dee891dfd01effd30dbf3ab23b"}
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.822687 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7d261af7-bc67-4638-8b4c-1f7a7cb129a2","Type":"ContainerStarted","Data":"4dc59f5bde50de695742dd402449e6bed53070bb708c81ea997c34e4fa508cdd"}
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.824686 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9de0f04a-f2ed-48ee-a873-8a02b70fb146","Type":"ContainerStarted","Data":"2a400b54ec8f9b85db4727bba0be8244035296a86df5337cc8b929aa73ff73bc"}
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.825611 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-2p5z8"
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.825645 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-2p5z8" event={"ID":"6b21a245-8e31-4b8a-be69-f0b874735531","Type":"ContainerDied","Data":"3c6757adce13418b062bca5421ed400286aded94eacba4d022b94e1897e151ae"}
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.826406 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0","Type":"ContainerStarted","Data":"171c5561239672d63d96c3dca0b47a2173b728e258d5aa3caa32792ca96dc35d"}
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.827731 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-cm2kn" event={"ID":"7ae6d556-e120-4432-82c1-b809d98ec9c9","Type":"ContainerDied","Data":"dd72d298da3df8eae16b6542f7f9df71a5c8e7f550a4e3aa3196deeb633f0ff1"}
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.827834 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-cm2kn"
Oct 07 17:20:49 crc kubenswrapper[4681]: E1007 17:20:49.828543 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="88c0d090-0803-4fff-a9a3-9b41529b8a23"
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.924172 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cm2kn"]
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.945546 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cm2kn"]
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.972569 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2p5z8"]
Oct 07 17:20:49 crc kubenswrapper[4681]: I1007 17:20:49.980746 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2p5z8"]
Oct 07 17:20:51 crc kubenswrapper[4681]: I1007 17:20:51.038931 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b21a245-8e31-4b8a-be69-f0b874735531" path="/var/lib/kubelet/pods/6b21a245-8e31-4b8a-be69-f0b874735531/volumes"
Oct 07 17:20:51 crc kubenswrapper[4681]: I1007 17:20:51.039419 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ae6d556-e120-4432-82c1-b809d98ec9c9" path="/var/lib/kubelet/pods/7ae6d556-e120-4432-82c1-b809d98ec9c9/volumes"
Oct 07 17:20:53 crc kubenswrapper[4681]: I1007 17:20:53.858741 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xhwkc" event={"ID":"8be45f14-7feb-40fa-a0a8-919c6d8cd052","Type":"ContainerStarted","Data":"9d227d80f3971f1f567a0d8be1ab329d415a8104dde1426ae65958db85c04dce"}
Oct 07 17:20:53 crc kubenswrapper[4681]: I1007 17:20:53.859346 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-xhwkc"
Oct 07 17:20:53 crc kubenswrapper[4681]: I1007 17:20:53.860695 4681 generic.go:334] "Generic (PLEG): container finished" podID="6a172508-6850-4bf5-8e7f-6c6674c4a1ee" containerID="e35b9fb62aece58cd018ab5081a5412dc42807cb7f787181d10d5ddbc3d9a293" exitCode=0
Oct 07 17:20:53 crc kubenswrapper[4681]: I1007 17:20:53.860778 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-6tf88" event={"ID":"6a172508-6850-4bf5-8e7f-6c6674c4a1ee","Type":"ContainerDied","Data":"e35b9fb62aece58cd018ab5081a5412dc42807cb7f787181d10d5ddbc3d9a293"}
Oct 07 17:20:53 crc kubenswrapper[4681]: I1007 17:20:53.864565 4681 generic.go:334] "Generic (PLEG): container finished" podID="7d261af7-bc67-4638-8b4c-1f7a7cb129a2" containerID="4dc59f5bde50de695742dd402449e6bed53070bb708c81ea997c34e4fa508cdd" exitCode=0
Oct 07 17:20:53 crc kubenswrapper[4681]: I1007 17:20:53.864617 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7d261af7-bc67-4638-8b4c-1f7a7cb129a2","Type":"ContainerDied","Data":"4dc59f5bde50de695742dd402449e6bed53070bb708c81ea997c34e4fa508cdd"}
Oct 07 17:20:53 crc kubenswrapper[4681]: I1007 17:20:53.871996 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9de0f04a-f2ed-48ee-a873-8a02b70fb146","Type":"ContainerStarted","Data":"45aa9a9098afee8727b54218d2e287c47224ff93ab656dc6974c6a86492d2ca3"}
Oct 07 17:20:53 crc kubenswrapper[4681]: I1007 17:20:53.901636 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-xhwkc" podStartSLOduration=32.404214477 podStartE2EDuration="35.901617795s" podCreationTimestamp="2025-10-07 17:20:18 +0000 UTC" firstStartedPulling="2025-10-07 17:20:49.550429056 +0000 UTC m=+1053.197840611" lastFinishedPulling="2025-10-07 17:20:53.047832384 +0000 UTC m=+1056.695243929" observedRunningTime="2025-10-07 17:20:53.876285588 +0000 UTC m=+1057.523697143" watchObservedRunningTime="2025-10-07 17:20:53.901617795 +0000 UTC m=+1057.549029350"
Oct 07 17:20:53 crc kubenswrapper[4681]: I1007 17:20:53.922142 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0","Type":"ContainerStarted","Data":"06f500a48ebbdfd224e2a36d3bae191f54115b9c64b6602270d84e036fc587b1"}
Oct 07 17:20:53 crc kubenswrapper[4681]: I1007 17:20:53.930616 4681 generic.go:334] "Generic (PLEG): container finished" podID="61391679-2b8c-4be3-b3d7-bd2d3e667c15" containerID="b1b64a2e2c2b7d29065cb2f1733d6bc22dfcf3ecad2a46a65d0e2fd7e22b80ab" exitCode=0
Oct 07 17:20:53 crc kubenswrapper[4681]: I1007 17:20:53.930672 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"61391679-2b8c-4be3-b3d7-bd2d3e667c15","Type":"ContainerDied","Data":"b1b64a2e2c2b7d29065cb2f1733d6bc22dfcf3ecad2a46a65d0e2fd7e22b80ab"}
Oct 07 17:20:54 crc kubenswrapper[4681]: I1007 17:20:54.942218 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7d261af7-bc67-4638-8b4c-1f7a7cb129a2","Type":"ContainerStarted","Data":"68664ccd8aa6652ee242904ec28f2aa513d6b6cf2ce231079901fd2da3eab063"}
Oct 07 17:20:54 crc kubenswrapper[4681]: I1007 17:20:54.954555 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"61391679-2b8c-4be3-b3d7-bd2d3e667c15","Type":"ContainerStarted","Data":"84188f00b7f64cb8489a907f36933434ea3aea343d4d162db4707816a611ec58"}
Oct 07 17:20:54 crc kubenswrapper[4681]: I1007 17:20:54.958077 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-6tf88" event={"ID":"6a172508-6850-4bf5-8e7f-6c6674c4a1ee","Type":"ContainerStarted","Data":"c9e14cb380598d35017725ed0729b595b5383fbe085ac70f91b3870e1150ee41"}
Oct 07 17:20:54 crc kubenswrapper[4681]: I1007 17:20:54.958102 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-6tf88" event={"ID":"6a172508-6850-4bf5-8e7f-6c6674c4a1ee","Type":"ContainerStarted","Data":"909d094bae4000e756e655db0c8fc5c1a892971a453c1b22d4340eef0ae6e7c8"}
Oct 07 17:20:54 crc kubenswrapper[4681]: I1007 17:20:54.958459 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-6tf88"
Oct 07 17:20:54 crc kubenswrapper[4681]: I1007 17:20:54.958530 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-6tf88"
Oct 07 17:20:54 crc kubenswrapper[4681]: I1007 17:20:54.970958 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=9.468847993 podStartE2EDuration="44.970942791s" podCreationTimestamp="2025-10-07 17:20:10 +0000 UTC" firstStartedPulling="2025-10-07 17:20:13.069988561 +0000 UTC m=+1016.717400116" lastFinishedPulling="2025-10-07 17:20:48.572083349 +0000 UTC m=+1052.219494914" observedRunningTime="2025-10-07 17:20:54.961826577 +0000 UTC m=+1058.609238132" watchObservedRunningTime="2025-10-07 17:20:54.970942791 +0000 UTC m=+1058.618354336"
Oct 07 17:20:54 crc kubenswrapper[4681]: I1007 17:20:54.989708 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-6tf88" podStartSLOduration=33.544814213 podStartE2EDuration="36.989694085s" podCreationTimestamp="2025-10-07 17:20:18 +0000 UTC" firstStartedPulling="2025-10-07 17:20:49.578648834 +0000 UTC m=+1053.226060389" lastFinishedPulling="2025-10-07 17:20:53.023528706 +0000 UTC m=+1056.670940261" observedRunningTime="2025-10-07 17:20:54.984664014 +0000 UTC m=+1058.632075569" watchObservedRunningTime="2025-10-07 17:20:54.989694085 +0000 UTC m=+1058.637105640"
Oct 07 17:20:55 crc kubenswrapper[4681]: I1007 17:20:55.009170 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371992.845627 podStartE2EDuration="44.009149368s" podCreationTimestamp="2025-10-07 17:20:11 +0000 UTC" firstStartedPulling="2025-10-07 17:20:14.634017095 +0000 UTC m=+1018.281428650" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:20:55.008657904 +0000 UTC m=+1058.656069479" watchObservedRunningTime="2025-10-07 17:20:55.009149368 +0000 UTC m=+1058.656560923"
Oct 07 17:20:56 crc kubenswrapper[4681]: I1007 17:20:56.971664 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9de0f04a-f2ed-48ee-a873-8a02b70fb146","Type":"ContainerStarted","Data":"11c8ec6874e58c7509868ed1fe4633500860aeb664fa8cf4bdc33bab3da0fa53"}
Oct 07 17:20:56 crc kubenswrapper[4681]: I1007 17:20:56.974205 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d5d2debf-c5bb-47fa-9d33-69c2f549a3e0","Type":"ContainerStarted","Data":"842c196e0f9518745dc3dda7b8eee7404783127f3f894bfd6ee56c42e99cfbaa"}
Oct 07 17:20:56 crc kubenswrapper[4681]: I1007 17:20:56.976933 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1aa4b182-3cf3-4e5d-b59d-5e00004cb912","Type":"ContainerStarted","Data":"334b943e1e0cfadd751b2e2ac91a5b40c9cf34885d9becbf8e8fd883cc77ccad"}
Oct 07 17:20:56 crc kubenswrapper[4681]: I1007 17:20:56.977115 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Oct 07 17:20:56 crc kubenswrapper[4681]: I1007 17:20:56.996845 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=31.353139152 podStartE2EDuration="37.996822957s" podCreationTimestamp="2025-10-07 17:20:19 +0000 UTC" firstStartedPulling="2025-10-07 17:20:49.669812319 +0000 UTC m=+1053.317223874" lastFinishedPulling="2025-10-07 17:20:56.313496124 +0000 UTC m=+1059.960907679" observedRunningTime="2025-10-07 17:20:56.989702448 +0000 UTC m=+1060.637114013" watchObservedRunningTime="2025-10-07 17:20:56.996822957 +0000 UTC m=+1060.644234512"
Oct 07 17:20:57 crc kubenswrapper[4681]: I1007 17:20:57.007483 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.7409134 podStartE2EDuration="44.007467084s" podCreationTimestamp="2025-10-07 17:20:13 +0000 UTC" firstStartedPulling="2025-10-07 17:20:15.217423678 +0000 UTC m=+1018.864835233" lastFinishedPulling="2025-10-07 17:20:56.483977362 +0000 UTC m=+1060.131388917" observedRunningTime="2025-10-07 17:20:57.004193453 +0000 UTC m=+1060.651605018" watchObservedRunningTime="2025-10-07 17:20:57.007467084 +0000 UTC m=+1060.654878639"
Oct 07 17:20:57 crc kubenswrapper[4681]: I1007 17:20:57.025203 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=29.231492325 podStartE2EDuration="36.025187748s" podCreationTimestamp="2025-10-07 17:20:21 +0000 UTC" firstStartedPulling="2025-10-07 17:20:49.53623208 +0000 UTC m=+1053.183643635" lastFinishedPulling="2025-10-07 17:20:56.329927503 +0000 UTC m=+1059.977339058" observedRunningTime="2025-10-07 17:20:57.022412131 +0000 UTC m=+1060.669823686" watchObservedRunningTime="2025-10-07 17:20:57.025187748 +0000 UTC m=+1060.672599303"
Oct 07 17:20:57 crc kubenswrapper[4681]: I1007 17:20:57.669278 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:57 crc kubenswrapper[4681]: I1007 17:20:57.984837 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"44a71bcd-3178-4394-8031-673c93a6981e","Type":"ContainerStarted","Data":"684797f48d112b934082488598634aba95ab60ac31835fe41f631c9666e298a5"}
Oct 07 17:20:58 crc kubenswrapper[4681]: I1007 17:20:58.668651 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:58 crc kubenswrapper[4681]: I1007 17:20:58.707870 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.003372 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c8a62bbf-000f-4b40-87e9-8dad6f714178","Type":"ContainerStarted","Data":"63784714c0d4bc78a8ead0376932527f23f2511b675cec54cfcb3226d4bdd559"}
Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.060465 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.329658 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-g2v94"]
Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.376843 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-5cgq9"]
Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.378331 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-5cgq9"
Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.380006 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.398410 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-5cgq9"]
Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.492100 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-v4f4x"]
Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.493288 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-v4f4x"
Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.501706 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-v4f4x"]
Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.502839 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.571983 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bpjc\" (UniqueName: \"kubernetes.io/projected/a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9-kube-api-access-7bpjc\") pod \"dnsmasq-dns-6bc7876d45-5cgq9\" (UID: \"a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9\") " pod="openstack/dnsmasq-dns-6bc7876d45-5cgq9"
Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.572040 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9-config\") pod \"dnsmasq-dns-6bc7876d45-5cgq9\" (UID: \"a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9\") " pod="openstack/dnsmasq-dns-6bc7876d45-5cgq9"
Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.572082 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-5cgq9\" (UID: \"a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9\") " pod="openstack/dnsmasq-dns-6bc7876d45-5cgq9"
Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.572106 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-5cgq9\" (UID: \"a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9\") " pod="openstack/dnsmasq-dns-6bc7876d45-5cgq9"
Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.676634 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-5cgq9\" (UID: \"a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9\") " pod="openstack/dnsmasq-dns-6bc7876d45-5cgq9"
Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.676689 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-5cgq9\" (UID: \"a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9\") " pod="openstack/dnsmasq-dns-6bc7876d45-5cgq9"
Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.676733 4681 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlt8n\" (UniqueName: \"kubernetes.io/projected/361da154-8a78-497d-9bb1-78335f5a286d-kube-api-access-hlt8n\") pod \"ovn-controller-metrics-v4f4x\" (UID: \"361da154-8a78-497d-9bb1-78335f5a286d\") " pod="openstack/ovn-controller-metrics-v4f4x" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.676779 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/361da154-8a78-497d-9bb1-78335f5a286d-ovs-rundir\") pod \"ovn-controller-metrics-v4f4x\" (UID: \"361da154-8a78-497d-9bb1-78335f5a286d\") " pod="openstack/ovn-controller-metrics-v4f4x" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.676811 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/361da154-8a78-497d-9bb1-78335f5a286d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-v4f4x\" (UID: \"361da154-8a78-497d-9bb1-78335f5a286d\") " pod="openstack/ovn-controller-metrics-v4f4x" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.676850 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/361da154-8a78-497d-9bb1-78335f5a286d-config\") pod \"ovn-controller-metrics-v4f4x\" (UID: \"361da154-8a78-497d-9bb1-78335f5a286d\") " pod="openstack/ovn-controller-metrics-v4f4x" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.676872 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361da154-8a78-497d-9bb1-78335f5a286d-combined-ca-bundle\") pod \"ovn-controller-metrics-v4f4x\" (UID: \"361da154-8a78-497d-9bb1-78335f5a286d\") " pod="openstack/ovn-controller-metrics-v4f4x" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.676916 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bpjc\" (UniqueName: \"kubernetes.io/projected/a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9-kube-api-access-7bpjc\") pod \"dnsmasq-dns-6bc7876d45-5cgq9\" (UID: \"a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9\") " pod="openstack/dnsmasq-dns-6bc7876d45-5cgq9" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.676956 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9-config\") pod \"dnsmasq-dns-6bc7876d45-5cgq9\" (UID: \"a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9\") " pod="openstack/dnsmasq-dns-6bc7876d45-5cgq9" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.676979 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/361da154-8a78-497d-9bb1-78335f5a286d-ovn-rundir\") pod \"ovn-controller-metrics-v4f4x\" (UID: \"361da154-8a78-497d-9bb1-78335f5a286d\") " pod="openstack/ovn-controller-metrics-v4f4x" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.678712 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-5cgq9\" (UID: \"a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9\") " pod="openstack/dnsmasq-dns-6bc7876d45-5cgq9" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 
17:20:59.679308 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9-config\") pod \"dnsmasq-dns-6bc7876d45-5cgq9\" (UID: \"a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9\") " pod="openstack/dnsmasq-dns-6bc7876d45-5cgq9" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.679342 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-5cgq9\" (UID: \"a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9\") " pod="openstack/dnsmasq-dns-6bc7876d45-5cgq9" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.699047 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bpjc\" (UniqueName: \"kubernetes.io/projected/a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9-kube-api-access-7bpjc\") pod \"dnsmasq-dns-6bc7876d45-5cgq9\" (UID: \"a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9\") " pod="openstack/dnsmasq-dns-6bc7876d45-5cgq9" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.699558 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-5cgq9" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.772311 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-g2v94" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.778318 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlt8n\" (UniqueName: \"kubernetes.io/projected/361da154-8a78-497d-9bb1-78335f5a286d-kube-api-access-hlt8n\") pod \"ovn-controller-metrics-v4f4x\" (UID: \"361da154-8a78-497d-9bb1-78335f5a286d\") " pod="openstack/ovn-controller-metrics-v4f4x" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.778388 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/361da154-8a78-497d-9bb1-78335f5a286d-ovs-rundir\") pod \"ovn-controller-metrics-v4f4x\" (UID: \"361da154-8a78-497d-9bb1-78335f5a286d\") " pod="openstack/ovn-controller-metrics-v4f4x" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.778431 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/361da154-8a78-497d-9bb1-78335f5a286d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-v4f4x\" (UID: \"361da154-8a78-497d-9bb1-78335f5a286d\") " pod="openstack/ovn-controller-metrics-v4f4x" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.778472 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/361da154-8a78-497d-9bb1-78335f5a286d-config\") pod \"ovn-controller-metrics-v4f4x\" (UID: \"361da154-8a78-497d-9bb1-78335f5a286d\") " pod="openstack/ovn-controller-metrics-v4f4x" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.778500 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361da154-8a78-497d-9bb1-78335f5a286d-combined-ca-bundle\") pod \"ovn-controller-metrics-v4f4x\" (UID: \"361da154-8a78-497d-9bb1-78335f5a286d\") " pod="openstack/ovn-controller-metrics-v4f4x" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.778544 4681 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/361da154-8a78-497d-9bb1-78335f5a286d-ovn-rundir\") pod \"ovn-controller-metrics-v4f4x\" (UID: \"361da154-8a78-497d-9bb1-78335f5a286d\") " pod="openstack/ovn-controller-metrics-v4f4x" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.778787 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/361da154-8a78-497d-9bb1-78335f5a286d-ovs-rundir\") pod \"ovn-controller-metrics-v4f4x\" (UID: \"361da154-8a78-497d-9bb1-78335f5a286d\") " pod="openstack/ovn-controller-metrics-v4f4x" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.778789 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/361da154-8a78-497d-9bb1-78335f5a286d-ovn-rundir\") pod \"ovn-controller-metrics-v4f4x\" (UID: \"361da154-8a78-497d-9bb1-78335f5a286d\") " pod="openstack/ovn-controller-metrics-v4f4x" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.779517 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/361da154-8a78-497d-9bb1-78335f5a286d-config\") pod \"ovn-controller-metrics-v4f4x\" (UID: \"361da154-8a78-497d-9bb1-78335f5a286d\") " pod="openstack/ovn-controller-metrics-v4f4x" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.783718 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/361da154-8a78-497d-9bb1-78335f5a286d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-v4f4x\" (UID: \"361da154-8a78-497d-9bb1-78335f5a286d\") " pod="openstack/ovn-controller-metrics-v4f4x" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.787603 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361da154-8a78-497d-9bb1-78335f5a286d-combined-ca-bundle\") pod \"ovn-controller-metrics-v4f4x\" (UID: \"361da154-8a78-497d-9bb1-78335f5a286d\") " pod="openstack/ovn-controller-metrics-v4f4x" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.802551 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9z6k8"] Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.823816 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlt8n\" (UniqueName: \"kubernetes.io/projected/361da154-8a78-497d-9bb1-78335f5a286d-kube-api-access-hlt8n\") pod \"ovn-controller-metrics-v4f4x\" (UID: \"361da154-8a78-497d-9bb1-78335f5a286d\") " pod="openstack/ovn-controller-metrics-v4f4x" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.869247 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-v4f4x" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.880409 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d94b9739-e609-4d7c-a7f2-813b90d33fdd-config\") pod \"d94b9739-e609-4d7c-a7f2-813b90d33fdd\" (UID: \"d94b9739-e609-4d7c-a7f2-813b90d33fdd\") " Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.880484 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hsrs\" (UniqueName: \"kubernetes.io/projected/d94b9739-e609-4d7c-a7f2-813b90d33fdd-kube-api-access-9hsrs\") pod \"d94b9739-e609-4d7c-a7f2-813b90d33fdd\" (UID: \"d94b9739-e609-4d7c-a7f2-813b90d33fdd\") " Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.880566 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d94b9739-e609-4d7c-a7f2-813b90d33fdd-dns-svc\") pod \"d94b9739-e609-4d7c-a7f2-813b90d33fdd\" (UID: \"d94b9739-e609-4d7c-a7f2-813b90d33fdd\") " Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.881067 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d94b9739-e609-4d7c-a7f2-813b90d33fdd-config" (OuterVolumeSpecName: "config") pod "d94b9739-e609-4d7c-a7f2-813b90d33fdd" (UID: "d94b9739-e609-4d7c-a7f2-813b90d33fdd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.881253 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d94b9739-e609-4d7c-a7f2-813b90d33fdd-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.881796 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d94b9739-e609-4d7c-a7f2-813b90d33fdd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d94b9739-e609-4d7c-a7f2-813b90d33fdd" (UID: "d94b9739-e609-4d7c-a7f2-813b90d33fdd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.907818 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d94b9739-e609-4d7c-a7f2-813b90d33fdd-kube-api-access-9hsrs" (OuterVolumeSpecName: "kube-api-access-9hsrs") pod "d94b9739-e609-4d7c-a7f2-813b90d33fdd" (UID: "d94b9739-e609-4d7c-a7f2-813b90d33fdd"). InnerVolumeSpecName "kube-api-access-9hsrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.925646 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.952614 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-gvxct"] Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.954000 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-gvxct" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.960258 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.982286 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-gvxct"] Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.983470 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hsrs\" (UniqueName: \"kubernetes.io/projected/d94b9739-e609-4d7c-a7f2-813b90d33fdd-kube-api-access-9hsrs\") on node \"crc\" DevicePath \"\"" Oct 07 17:20:59 crc kubenswrapper[4681]: I1007 17:20:59.983491 4681 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d94b9739-e609-4d7c-a7f2-813b90d33fdd-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.010534 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.022719 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-g2v94" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.025197 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-g2v94" event={"ID":"d94b9739-e609-4d7c-a7f2-813b90d33fdd","Type":"ContainerDied","Data":"616b8c7cb91bbb460329bbf15d09221e3532f68423f9b67b395b0a061c4f57cd"} Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.026183 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.097073 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bccb5ffe-cd60-4f08-bb97-aebd92cea802-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-gvxct\" (UID: \"bccb5ffe-cd60-4f08-bb97-aebd92cea802\") " pod="openstack/dnsmasq-dns-8554648995-gvxct" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.097112 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bccb5ffe-cd60-4f08-bb97-aebd92cea802-dns-svc\") pod \"dnsmasq-dns-8554648995-gvxct\" (UID: \"bccb5ffe-cd60-4f08-bb97-aebd92cea802\") " pod="openstack/dnsmasq-dns-8554648995-gvxct" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.097137 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bccb5ffe-cd60-4f08-bb97-aebd92cea802-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-gvxct\" (UID: \"bccb5ffe-cd60-4f08-bb97-aebd92cea802\") " pod="openstack/dnsmasq-dns-8554648995-gvxct" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.097212 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xl54\" (UniqueName: \"kubernetes.io/projected/bccb5ffe-cd60-4f08-bb97-aebd92cea802-kube-api-access-8xl54\") pod \"dnsmasq-dns-8554648995-gvxct\" (UID: \"bccb5ffe-cd60-4f08-bb97-aebd92cea802\") " pod="openstack/dnsmasq-dns-8554648995-gvxct" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.097246 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bccb5ffe-cd60-4f08-bb97-aebd92cea802-config\") pod \"dnsmasq-dns-8554648995-gvxct\" (UID: \"bccb5ffe-cd60-4f08-bb97-aebd92cea802\") " pod="openstack/dnsmasq-dns-8554648995-gvxct" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.143184 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-g2v94"] Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.177050 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-g2v94"] Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.177154 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.199669 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bccb5ffe-cd60-4f08-bb97-aebd92cea802-config\") pod \"dnsmasq-dns-8554648995-gvxct\" (UID: \"bccb5ffe-cd60-4f08-bb97-aebd92cea802\") " pod="openstack/dnsmasq-dns-8554648995-gvxct" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.199812 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bccb5ffe-cd60-4f08-bb97-aebd92cea802-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-gvxct\" (UID: \"bccb5ffe-cd60-4f08-bb97-aebd92cea802\") " pod="openstack/dnsmasq-dns-8554648995-gvxct" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.199840 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bccb5ffe-cd60-4f08-bb97-aebd92cea802-dns-svc\") pod \"dnsmasq-dns-8554648995-gvxct\" (UID: \"bccb5ffe-cd60-4f08-bb97-aebd92cea802\") " pod="openstack/dnsmasq-dns-8554648995-gvxct" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.199868 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bccb5ffe-cd60-4f08-bb97-aebd92cea802-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-gvxct\" (UID: \"bccb5ffe-cd60-4f08-bb97-aebd92cea802\") " pod="openstack/dnsmasq-dns-8554648995-gvxct" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.199979 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xl54\" (UniqueName: \"kubernetes.io/projected/bccb5ffe-cd60-4f08-bb97-aebd92cea802-kube-api-access-8xl54\") pod \"dnsmasq-dns-8554648995-gvxct\" (UID: \"bccb5ffe-cd60-4f08-bb97-aebd92cea802\") " pod="openstack/dnsmasq-dns-8554648995-gvxct" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.200921 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bccb5ffe-cd60-4f08-bb97-aebd92cea802-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-gvxct\" (UID: \"bccb5ffe-cd60-4f08-bb97-aebd92cea802\") " pod="openstack/dnsmasq-dns-8554648995-gvxct" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.201286 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bccb5ffe-cd60-4f08-bb97-aebd92cea802-dns-svc\") pod \"dnsmasq-dns-8554648995-gvxct\" (UID: \"bccb5ffe-cd60-4f08-bb97-aebd92cea802\") " pod="openstack/dnsmasq-dns-8554648995-gvxct" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.201567 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/bccb5ffe-cd60-4f08-bb97-aebd92cea802-config\") pod \"dnsmasq-dns-8554648995-gvxct\" (UID: \"bccb5ffe-cd60-4f08-bb97-aebd92cea802\") " pod="openstack/dnsmasq-dns-8554648995-gvxct" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.202845 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bccb5ffe-cd60-4f08-bb97-aebd92cea802-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-gvxct\" (UID: \"bccb5ffe-cd60-4f08-bb97-aebd92cea802\") " pod="openstack/dnsmasq-dns-8554648995-gvxct" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.239690 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xl54\" (UniqueName: \"kubernetes.io/projected/bccb5ffe-cd60-4f08-bb97-aebd92cea802-kube-api-access-8xl54\") pod \"dnsmasq-dns-8554648995-gvxct\" (UID: \"bccb5ffe-cd60-4f08-bb97-aebd92cea802\") " pod="openstack/dnsmasq-dns-8554648995-gvxct" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.310073 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-gvxct" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.395678 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-5cgq9"] Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.399381 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9z6k8" Oct 07 17:21:00 crc kubenswrapper[4681]: W1007 17:21:00.420598 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1c7e087_913d_48f2_a9ae_3a22ba9ef4b9.slice/crio-f7c22a7c1b34cff90eefdc85507dfa83b4d015444c4e52cf1992c344e6215a93 WatchSource:0}: Error finding container f7c22a7c1b34cff90eefdc85507dfa83b4d015444c4e52cf1992c344e6215a93: Status 404 returned error can't find the container with id f7c22a7c1b34cff90eefdc85507dfa83b4d015444c4e52cf1992c344e6215a93 Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.517463 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0afd9d9-c53a-450f-8aa1-57aa2996d9ea-dns-svc\") pod \"d0afd9d9-c53a-450f-8aa1-57aa2996d9ea\" (UID: \"d0afd9d9-c53a-450f-8aa1-57aa2996d9ea\") " Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.517631 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0afd9d9-c53a-450f-8aa1-57aa2996d9ea-config\") pod \"d0afd9d9-c53a-450f-8aa1-57aa2996d9ea\" (UID: \"d0afd9d9-c53a-450f-8aa1-57aa2996d9ea\") " Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.517660 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpnzw\" (UniqueName: \"kubernetes.io/projected/d0afd9d9-c53a-450f-8aa1-57aa2996d9ea-kube-api-access-gpnzw\") pod \"d0afd9d9-c53a-450f-8aa1-57aa2996d9ea\" (UID: \"d0afd9d9-c53a-450f-8aa1-57aa2996d9ea\") " Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.518199 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.518208 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0afd9d9-c53a-450f-8aa1-57aa2996d9ea-dns-svc" (OuterVolumeSpecName: "dns-svc") pod 
"d0afd9d9-c53a-450f-8aa1-57aa2996d9ea" (UID: "d0afd9d9-c53a-450f-8aa1-57aa2996d9ea"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.519494 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.526955 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.527182 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.527852 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.529729 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-wts6p" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.540226 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0afd9d9-c53a-450f-8aa1-57aa2996d9ea-config" (OuterVolumeSpecName: "config") pod "d0afd9d9-c53a-450f-8aa1-57aa2996d9ea" (UID: "d0afd9d9-c53a-450f-8aa1-57aa2996d9ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.544580 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.548847 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0afd9d9-c53a-450f-8aa1-57aa2996d9ea-kube-api-access-gpnzw" (OuterVolumeSpecName: "kube-api-access-gpnzw") pod "d0afd9d9-c53a-450f-8aa1-57aa2996d9ea" (UID: "d0afd9d9-c53a-450f-8aa1-57aa2996d9ea"). InnerVolumeSpecName "kube-api-access-gpnzw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.619733 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnwqq\" (UniqueName: \"kubernetes.io/projected/1209f82a-cbcc-4833-98f0-6e2a07b53aeb-kube-api-access-gnwqq\") pod \"ovn-northd-0\" (UID: \"1209f82a-cbcc-4833-98f0-6e2a07b53aeb\") " pod="openstack/ovn-northd-0" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.620171 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1209f82a-cbcc-4833-98f0-6e2a07b53aeb-config\") pod \"ovn-northd-0\" (UID: \"1209f82a-cbcc-4833-98f0-6e2a07b53aeb\") " pod="openstack/ovn-northd-0" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.620266 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1209f82a-cbcc-4833-98f0-6e2a07b53aeb-scripts\") pod \"ovn-northd-0\" (UID: \"1209f82a-cbcc-4833-98f0-6e2a07b53aeb\") " pod="openstack/ovn-northd-0" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.620288 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1209f82a-cbcc-4833-98f0-6e2a07b53aeb-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1209f82a-cbcc-4833-98f0-6e2a07b53aeb\") " pod="openstack/ovn-northd-0" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.620311 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1209f82a-cbcc-4833-98f0-6e2a07b53aeb-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1209f82a-cbcc-4833-98f0-6e2a07b53aeb\") " pod="openstack/ovn-northd-0" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.620372 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1209f82a-cbcc-4833-98f0-6e2a07b53aeb-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1209f82a-cbcc-4833-98f0-6e2a07b53aeb\") " pod="openstack/ovn-northd-0" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.620441 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1209f82a-cbcc-4833-98f0-6e2a07b53aeb-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1209f82a-cbcc-4833-98f0-6e2a07b53aeb\") " pod="openstack/ovn-northd-0" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.620498 4681 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0afd9d9-c53a-450f-8aa1-57aa2996d9ea-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.620509 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0afd9d9-c53a-450f-8aa1-57aa2996d9ea-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.620518 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpnzw\" (UniqueName: \"kubernetes.io/projected/d0afd9d9-c53a-450f-8aa1-57aa2996d9ea-kube-api-access-gpnzw\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.721293 4681 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1209f82a-cbcc-4833-98f0-6e2a07b53aeb-scripts\") pod \"ovn-northd-0\" (UID: \"1209f82a-cbcc-4833-98f0-6e2a07b53aeb\") " pod="openstack/ovn-northd-0" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.721335 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1209f82a-cbcc-4833-98f0-6e2a07b53aeb-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1209f82a-cbcc-4833-98f0-6e2a07b53aeb\") " pod="openstack/ovn-northd-0" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.721352 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1209f82a-cbcc-4833-98f0-6e2a07b53aeb-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1209f82a-cbcc-4833-98f0-6e2a07b53aeb\") " pod="openstack/ovn-northd-0" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.721381 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1209f82a-cbcc-4833-98f0-6e2a07b53aeb-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1209f82a-cbcc-4833-98f0-6e2a07b53aeb\") " pod="openstack/ovn-northd-0" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.721398 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1209f82a-cbcc-4833-98f0-6e2a07b53aeb-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1209f82a-cbcc-4833-98f0-6e2a07b53aeb\") " pod="openstack/ovn-northd-0" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.721437 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnwqq\" (UniqueName: \"kubernetes.io/projected/1209f82a-cbcc-4833-98f0-6e2a07b53aeb-kube-api-access-gnwqq\") pod \"ovn-northd-0\" (UID: \"1209f82a-cbcc-4833-98f0-6e2a07b53aeb\") " pod="openstack/ovn-northd-0" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.721483 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1209f82a-cbcc-4833-98f0-6e2a07b53aeb-config\") pod \"ovn-northd-0\" (UID: \"1209f82a-cbcc-4833-98f0-6e2a07b53aeb\") " pod="openstack/ovn-northd-0" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.722392 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1209f82a-cbcc-4833-98f0-6e2a07b53aeb-config\") pod \"ovn-northd-0\" (UID: \"1209f82a-cbcc-4833-98f0-6e2a07b53aeb\") " pod="openstack/ovn-northd-0" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.722565 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1209f82a-cbcc-4833-98f0-6e2a07b53aeb-scripts\") pod \"ovn-northd-0\" (UID: \"1209f82a-cbcc-4833-98f0-6e2a07b53aeb\") " pod="openstack/ovn-northd-0" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.722718 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1209f82a-cbcc-4833-98f0-6e2a07b53aeb-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1209f82a-cbcc-4833-98f0-6e2a07b53aeb\") " pod="openstack/ovn-northd-0" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.726735 4681 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1209f82a-cbcc-4833-98f0-6e2a07b53aeb-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1209f82a-cbcc-4833-98f0-6e2a07b53aeb\") " pod="openstack/ovn-northd-0" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.734396 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1209f82a-cbcc-4833-98f0-6e2a07b53aeb-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1209f82a-cbcc-4833-98f0-6e2a07b53aeb\") " pod="openstack/ovn-northd-0" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.738637 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-v4f4x"] Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.744424 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1209f82a-cbcc-4833-98f0-6e2a07b53aeb-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1209f82a-cbcc-4833-98f0-6e2a07b53aeb\") " pod="openstack/ovn-northd-0" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.750826 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnwqq\" (UniqueName: \"kubernetes.io/projected/1209f82a-cbcc-4833-98f0-6e2a07b53aeb-kube-api-access-gnwqq\") pod \"ovn-northd-0\" (UID: \"1209f82a-cbcc-4833-98f0-6e2a07b53aeb\") " pod="openstack/ovn-northd-0" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.846854 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 07 17:21:00 crc kubenswrapper[4681]: I1007 17:21:00.877276 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-gvxct"] Oct 07 17:21:01 crc kubenswrapper[4681]: I1007 17:21:01.053375 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d94b9739-e609-4d7c-a7f2-813b90d33fdd" path="/var/lib/kubelet/pods/d94b9739-e609-4d7c-a7f2-813b90d33fdd/volumes" Oct 07 17:21:01 crc kubenswrapper[4681]: I1007 17:21:01.055084 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-v4f4x" event={"ID":"361da154-8a78-497d-9bb1-78335f5a286d","Type":"ContainerStarted","Data":"b8743f065b61f53fe1b81dea27388a4eb52925bffb8a7dc0efdf34e7569b35b3"} Oct 07 17:21:01 crc kubenswrapper[4681]: I1007 17:21:01.055631 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-gvxct" event={"ID":"bccb5ffe-cd60-4f08-bb97-aebd92cea802","Type":"ContainerStarted","Data":"5d94eb8d091d27b605d4da6b0e07b7ff9b65ebe3095418ef6d73b5c26dd91d45"} Oct 07 17:21:01 crc kubenswrapper[4681]: I1007 17:21:01.064739 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-9z6k8" event={"ID":"d0afd9d9-c53a-450f-8aa1-57aa2996d9ea","Type":"ContainerDied","Data":"d4bc18a318b3b5ab3352ed27be0fb62d07e4012542a676d1ccfc17e12562c526"} Oct 07 17:21:01 crc kubenswrapper[4681]: I1007 17:21:01.064831 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-9z6k8" Oct 07 17:21:01 crc kubenswrapper[4681]: I1007 17:21:01.073109 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-5cgq9" event={"ID":"a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9","Type":"ContainerStarted","Data":"f7c22a7c1b34cff90eefdc85507dfa83b4d015444c4e52cf1992c344e6215a93"} Oct 07 17:21:01 crc kubenswrapper[4681]: I1007 17:21:01.124312 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9z6k8"] Oct 07 17:21:01 crc kubenswrapper[4681]: I1007 17:21:01.135804 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-9z6k8"] Oct 07 17:21:01 crc kubenswrapper[4681]: I1007 17:21:01.307824 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 07 17:21:01 crc kubenswrapper[4681]: W1007 17:21:01.308030 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1209f82a_cbcc_4833_98f0_6e2a07b53aeb.slice/crio-80d98f7d4399feff2a90f5bbe72958120603f7c124b0d9c78c31212bf15ef68e WatchSource:0}: Error finding container 80d98f7d4399feff2a90f5bbe72958120603f7c124b0d9c78c31212bf15ef68e: Status 404 returned error can't find the container with id 80d98f7d4399feff2a90f5bbe72958120603f7c124b0d9c78c31212bf15ef68e Oct 07 17:21:02 crc kubenswrapper[4681]: I1007 17:21:02.083399 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1209f82a-cbcc-4833-98f0-6e2a07b53aeb","Type":"ContainerStarted","Data":"80d98f7d4399feff2a90f5bbe72958120603f7c124b0d9c78c31212bf15ef68e"} Oct 07 17:21:02 crc kubenswrapper[4681]: I1007 17:21:02.085069 4681 generic.go:334] "Generic (PLEG): container finished" podID="bccb5ffe-cd60-4f08-bb97-aebd92cea802" containerID="49ae91953c0ddfdfefac17b394da4d992fa915a108902e404bf21836b9b88025" exitCode=0 Oct 07 17:21:02 crc kubenswrapper[4681]: I1007 17:21:02.085145 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-gvxct" event={"ID":"bccb5ffe-cd60-4f08-bb97-aebd92cea802","Type":"ContainerDied","Data":"49ae91953c0ddfdfefac17b394da4d992fa915a108902e404bf21836b9b88025"} Oct 07 17:21:02 crc kubenswrapper[4681]: I1007 17:21:02.087035 4681 generic.go:334] "Generic (PLEG): container finished" podID="a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9" containerID="c4605cfee6ee173eca78061a1fdcadf6a309561d7f778c81c2f47e54590196f7" exitCode=0 Oct 07 17:21:02 crc kubenswrapper[4681]: I1007 17:21:02.087098 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-5cgq9" event={"ID":"a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9","Type":"ContainerDied","Data":"c4605cfee6ee173eca78061a1fdcadf6a309561d7f778c81c2f47e54590196f7"} Oct 07 17:21:02 crc kubenswrapper[4681]: I1007 17:21:02.088815 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-v4f4x" event={"ID":"361da154-8a78-497d-9bb1-78335f5a286d","Type":"ContainerStarted","Data":"a8f036eebd0fdf0c9cc90061a45cb58fb4f8466f02045113b87046349845dea2"} Oct 07 17:21:02 crc kubenswrapper[4681]: I1007 17:21:02.285166 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 07 17:21:02 crc kubenswrapper[4681]: I1007 17:21:02.285214 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 07 17:21:02 crc kubenswrapper[4681]: I1007 
Oct 07 17:21:02 crc kubenswrapper[4681]: I1007 17:21:02.382957 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-v4f4x" podStartSLOduration=3.382932762 podStartE2EDuration="3.382932762s" podCreationTimestamp="2025-10-07 17:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:21:02.154049373 +0000 UTC m=+1065.801460938" watchObservedRunningTime="2025-10-07 17:21:02.382932762 +0000 UTC m=+1066.030344327"
Oct 07 17:21:03 crc kubenswrapper[4681]: I1007 17:21:03.038835 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0afd9d9-c53a-450f-8aa1-57aa2996d9ea" path="/var/lib/kubelet/pods/d0afd9d9-c53a-450f-8aa1-57aa2996d9ea/volumes"
Oct 07 17:21:03 crc kubenswrapper[4681]: I1007 17:21:03.102704 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-5cgq9" event={"ID":"a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9","Type":"ContainerStarted","Data":"56ff8c44d117b9cd019f1c6e8bf820bfb4659b4831e21647bab5726f91885bb0"}
Oct 07 17:21:03 crc kubenswrapper[4681]: I1007 17:21:03.107196 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-gvxct" event={"ID":"bccb5ffe-cd60-4f08-bb97-aebd92cea802","Type":"ContainerStarted","Data":"03d7a0cde160aeb80ee9d2ef5200355a132eccea6d49872c1389fb1088950593"}
Oct 07 17:21:03 crc kubenswrapper[4681]: I1007 17:21:03.148931 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Oct 07 17:21:03 crc kubenswrapper[4681]: I1007 17:21:03.253058 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Oct 07 17:21:03 crc kubenswrapper[4681]: I1007 17:21:03.253100 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Oct 07 17:21:03 crc kubenswrapper[4681]: I1007 17:21:03.298582 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Oct 07 17:21:03 crc kubenswrapper[4681]: I1007 17:21:03.483033 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-vl7gx"]
Oct 07 17:21:03 crc kubenswrapper[4681]: I1007 17:21:03.484051 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-vl7gx"
Oct 07 17:21:03 crc kubenswrapper[4681]: I1007 17:21:03.496546 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-vl7gx"]
Oct 07 17:21:03 crc kubenswrapper[4681]: I1007 17:21:03.568243 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5phj\" (UniqueName: \"kubernetes.io/projected/d4fdd3f9-ef24-465f-96f0-7c09c34124b4-kube-api-access-g5phj\") pod \"keystone-db-create-vl7gx\" (UID: \"d4fdd3f9-ef24-465f-96f0-7c09c34124b4\") " pod="openstack/keystone-db-create-vl7gx"
Oct 07 17:21:03 crc kubenswrapper[4681]: I1007 17:21:03.670277 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5phj\" (UniqueName: \"kubernetes.io/projected/d4fdd3f9-ef24-465f-96f0-7c09c34124b4-kube-api-access-g5phj\") pod \"keystone-db-create-vl7gx\" (UID: \"d4fdd3f9-ef24-465f-96f0-7c09c34124b4\") " pod="openstack/keystone-db-create-vl7gx"
Oct 07 17:21:03 crc kubenswrapper[4681]: I1007 17:21:03.689014 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5phj\" (UniqueName: \"kubernetes.io/projected/d4fdd3f9-ef24-465f-96f0-7c09c34124b4-kube-api-access-g5phj\") pod \"keystone-db-create-vl7gx\" (UID: \"d4fdd3f9-ef24-465f-96f0-7c09c34124b4\") " pod="openstack/keystone-db-create-vl7gx"
Oct 07 17:21:03 crc kubenswrapper[4681]: I1007 17:21:03.723565 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-xrvzv"]
Oct 07 17:21:03 crc kubenswrapper[4681]: I1007 17:21:03.724732 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xrvzv"
Oct 07 17:21:03 crc kubenswrapper[4681]: I1007 17:21:03.740537 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xrvzv"]
Oct 07 17:21:03 crc kubenswrapper[4681]: I1007 17:21:03.772340 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrktv\" (UniqueName: \"kubernetes.io/projected/503e25f5-b736-418a-b51a-7f5f2ee82ba8-kube-api-access-lrktv\") pod \"placement-db-create-xrvzv\" (UID: \"503e25f5-b736-418a-b51a-7f5f2ee82ba8\") " pod="openstack/placement-db-create-xrvzv"
Oct 07 17:21:03 crc kubenswrapper[4681]: I1007 17:21:03.800183 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-vl7gx"
Oct 07 17:21:03 crc kubenswrapper[4681]: I1007 17:21:03.846032 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Oct 07 17:21:03 crc kubenswrapper[4681]: I1007 17:21:03.874167 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrktv\" (UniqueName: \"kubernetes.io/projected/503e25f5-b736-418a-b51a-7f5f2ee82ba8-kube-api-access-lrktv\") pod \"placement-db-create-xrvzv\" (UID: \"503e25f5-b736-418a-b51a-7f5f2ee82ba8\") " pod="openstack/placement-db-create-xrvzv"
Oct 07 17:21:03 crc kubenswrapper[4681]: I1007 17:21:03.894440 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrktv\" (UniqueName: \"kubernetes.io/projected/503e25f5-b736-418a-b51a-7f5f2ee82ba8-kube-api-access-lrktv\") pod \"placement-db-create-xrvzv\" (UID: \"503e25f5-b736-418a-b51a-7f5f2ee82ba8\") " pod="openstack/placement-db-create-xrvzv"
Oct 07 17:21:04 crc kubenswrapper[4681]: I1007 17:21:04.051192 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xrvzv"
Oct 07 17:21:04 crc kubenswrapper[4681]: I1007 17:21:04.095677 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-rdh4c"]
Oct 07 17:21:04 crc kubenswrapper[4681]: I1007 17:21:04.102967 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-rdh4c"
Oct 07 17:21:04 crc kubenswrapper[4681]: I1007 17:21:04.137644 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-rdh4c"]
Oct 07 17:21:04 crc kubenswrapper[4681]: I1007 17:21:04.181273 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rll78\" (UniqueName: \"kubernetes.io/projected/5473d55d-7c8b-4e5a-ad3c-0b30d31ee9b4-kube-api-access-rll78\") pod \"glance-db-create-rdh4c\" (UID: \"5473d55d-7c8b-4e5a-ad3c-0b30d31ee9b4\") " pod="openstack/glance-db-create-rdh4c"
Oct 07 17:21:04 crc kubenswrapper[4681]: I1007 17:21:04.214142 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Oct 07 17:21:04 crc kubenswrapper[4681]: I1007 17:21:04.283102 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rll78\" (UniqueName: \"kubernetes.io/projected/5473d55d-7c8b-4e5a-ad3c-0b30d31ee9b4-kube-api-access-rll78\") pod \"glance-db-create-rdh4c\" (UID: \"5473d55d-7c8b-4e5a-ad3c-0b30d31ee9b4\") " pod="openstack/glance-db-create-rdh4c"
Oct 07 17:21:04 crc kubenswrapper[4681]: I1007 17:21:04.306126 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rll78\" (UniqueName: \"kubernetes.io/projected/5473d55d-7c8b-4e5a-ad3c-0b30d31ee9b4-kube-api-access-rll78\") pod \"glance-db-create-rdh4c\" (UID: \"5473d55d-7c8b-4e5a-ad3c-0b30d31ee9b4\") " pod="openstack/glance-db-create-rdh4c"
Oct 07 17:21:04 crc kubenswrapper[4681]: I1007 17:21:04.331272 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-vl7gx"]
Oct 07 17:21:04 crc kubenswrapper[4681]: W1007 17:21:04.336062 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4fdd3f9_ef24_465f_96f0_7c09c34124b4.slice/crio-613de4c2a309127a21869bbf34edb67a252b1a1739c128063e836ae92a7d3d3c WatchSource:0}: Error finding container 613de4c2a309127a21869bbf34edb67a252b1a1739c128063e836ae92a7d3d3c: Status 404 returned error can't find the container with id 613de4c2a309127a21869bbf34edb67a252b1a1739c128063e836ae92a7d3d3c
Oct 07 17:21:04 crc kubenswrapper[4681]: I1007 17:21:04.447474 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-rdh4c"
Oct 07 17:21:04 crc kubenswrapper[4681]: I1007 17:21:04.528534 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xrvzv"]
Oct 07 17:21:04 crc kubenswrapper[4681]: W1007 17:21:04.530771 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod503e25f5_b736_418a_b51a_7f5f2ee82ba8.slice/crio-a7a9a4b44291714e64d136b11bbafeb45018a1f84af6213e5f0492a0a8f3180f WatchSource:0}: Error finding container a7a9a4b44291714e64d136b11bbafeb45018a1f84af6213e5f0492a0a8f3180f: Status 404 returned error can't find the container with id a7a9a4b44291714e64d136b11bbafeb45018a1f84af6213e5f0492a0a8f3180f
Oct 07 17:21:04 crc kubenswrapper[4681]: W1007 17:21:04.766517 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5473d55d_7c8b_4e5a_ad3c_0b30d31ee9b4.slice/crio-e84b60675e6b0c50200c8c73dd9775dc1838555b1d458b7f4060d83f17f41b80 WatchSource:0}: Error finding container e84b60675e6b0c50200c8c73dd9775dc1838555b1d458b7f4060d83f17f41b80: Status 404 returned error can't find the container with id e84b60675e6b0c50200c8c73dd9775dc1838555b1d458b7f4060d83f17f41b80
Oct 07 17:21:04 crc kubenswrapper[4681]: I1007 17:21:04.767334 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-rdh4c"]
Oct 07 17:21:05 crc kubenswrapper[4681]: I1007 17:21:05.159144 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rdh4c" event={"ID":"5473d55d-7c8b-4e5a-ad3c-0b30d31ee9b4","Type":"ContainerStarted","Data":"e84b60675e6b0c50200c8c73dd9775dc1838555b1d458b7f4060d83f17f41b80"}
Oct 07 17:21:05 crc kubenswrapper[4681]: I1007 17:21:05.160763 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vl7gx" event={"ID":"d4fdd3f9-ef24-465f-96f0-7c09c34124b4","Type":"ContainerStarted","Data":"613de4c2a309127a21869bbf34edb67a252b1a1739c128063e836ae92a7d3d3c"}
Oct 07 17:21:05 crc kubenswrapper[4681]: I1007 17:21:05.161690 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xrvzv" event={"ID":"503e25f5-b736-418a-b51a-7f5f2ee82ba8","Type":"ContainerStarted","Data":"a7a9a4b44291714e64d136b11bbafeb45018a1f84af6213e5f0492a0a8f3180f"}
Oct 07 17:21:05 crc kubenswrapper[4681]: I1007 17:21:05.162299 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bc7876d45-5cgq9"
Oct 07 17:21:05 crc kubenswrapper[4681]: I1007 17:21:05.182644 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bc7876d45-5cgq9" podStartSLOduration=5.61311922 podStartE2EDuration="6.182622405s" podCreationTimestamp="2025-10-07 17:20:59 +0000 UTC" firstStartedPulling="2025-10-07 17:21:00.424546981 +0000 UTC m=+1064.071958536" lastFinishedPulling="2025-10-07 17:21:00.994050166 +0000 UTC m=+1064.641461721" observedRunningTime="2025-10-07 17:21:05.179478738 +0000 UTC m=+1068.826890323" watchObservedRunningTime="2025-10-07 17:21:05.182622405 +0000 UTC m=+1068.830033960"
Oct 07 17:21:06 crc kubenswrapper[4681]: I1007 17:21:06.151491 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-5cgq9"]
Oct 07 17:21:06 crc kubenswrapper[4681]: I1007 17:21:06.197863 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-mjmjw"]
Oct 07 17:21:06 crc kubenswrapper[4681]: I1007 17:21:06.199227 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw"
Oct 07 17:21:06 crc kubenswrapper[4681]: I1007 17:21:06.284381 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-mjmjw"]
Oct 07 17:21:06 crc kubenswrapper[4681]: I1007 17:21:06.314752 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/edde6df1-fefc-48ee-b81a-638a564a6e18-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-mjmjw\" (UID: \"edde6df1-fefc-48ee-b81a-638a564a6e18\") " pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw"
Oct 07 17:21:06 crc kubenswrapper[4681]: I1007 17:21:06.314810 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edde6df1-fefc-48ee-b81a-638a564a6e18-config\") pod \"dnsmasq-dns-b8fbc5445-mjmjw\" (UID: \"edde6df1-fefc-48ee-b81a-638a564a6e18\") " pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw"
Oct 07 17:21:06 crc kubenswrapper[4681]: I1007 17:21:06.314864 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/edde6df1-fefc-48ee-b81a-638a564a6e18-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-mjmjw\" (UID: \"edde6df1-fefc-48ee-b81a-638a564a6e18\") " pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw"
Oct 07 17:21:06 crc kubenswrapper[4681]: I1007 17:21:06.314961 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2dtq\" (UniqueName: \"kubernetes.io/projected/edde6df1-fefc-48ee-b81a-638a564a6e18-kube-api-access-p2dtq\") pod \"dnsmasq-dns-b8fbc5445-mjmjw\" (UID: \"edde6df1-fefc-48ee-b81a-638a564a6e18\") " pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw"
Oct 07 17:21:06 crc kubenswrapper[4681]: I1007 17:21:06.315013 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/edde6df1-fefc-48ee-b81a-638a564a6e18-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-mjmjw\" (UID: \"edde6df1-fefc-48ee-b81a-638a564a6e18\") " pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw"
Oct 07 17:21:06 crc kubenswrapper[4681]: I1007 17:21:06.416806 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/edde6df1-fefc-48ee-b81a-638a564a6e18-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-mjmjw\" (UID: \"edde6df1-fefc-48ee-b81a-638a564a6e18\") " pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw"
Oct 07 17:21:06 crc kubenswrapper[4681]: I1007 17:21:06.416914 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2dtq\" (UniqueName: \"kubernetes.io/projected/edde6df1-fefc-48ee-b81a-638a564a6e18-kube-api-access-p2dtq\") pod \"dnsmasq-dns-b8fbc5445-mjmjw\" (UID: \"edde6df1-fefc-48ee-b81a-638a564a6e18\") " pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw"
Oct 07 17:21:06 crc kubenswrapper[4681]: I1007 17:21:06.416970 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/edde6df1-fefc-48ee-b81a-638a564a6e18-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-mjmjw\" (UID: \"edde6df1-fefc-48ee-b81a-638a564a6e18\") " pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw"
Oct 07 17:21:06 crc kubenswrapper[4681]: I1007 17:21:06.416992 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/edde6df1-fefc-48ee-b81a-638a564a6e18-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-mjmjw\" (UID: \"edde6df1-fefc-48ee-b81a-638a564a6e18\") " pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw"
Oct 07 17:21:06 crc kubenswrapper[4681]: I1007 17:21:06.417025 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edde6df1-fefc-48ee-b81a-638a564a6e18-config\") pod \"dnsmasq-dns-b8fbc5445-mjmjw\" (UID: \"edde6df1-fefc-48ee-b81a-638a564a6e18\") " pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw"
Oct 07 17:21:06 crc kubenswrapper[4681]: I1007 17:21:06.417802 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edde6df1-fefc-48ee-b81a-638a564a6e18-config\") pod \"dnsmasq-dns-b8fbc5445-mjmjw\" (UID: \"edde6df1-fefc-48ee-b81a-638a564a6e18\") " pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw"
Oct 07 17:21:06 crc kubenswrapper[4681]: I1007 17:21:06.417840 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/edde6df1-fefc-48ee-b81a-638a564a6e18-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-mjmjw\" (UID: \"edde6df1-fefc-48ee-b81a-638a564a6e18\") " pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw"
Oct 07 17:21:06 crc kubenswrapper[4681]: I1007 17:21:06.418128 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/edde6df1-fefc-48ee-b81a-638a564a6e18-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-mjmjw\" (UID: \"edde6df1-fefc-48ee-b81a-638a564a6e18\") " pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw"
Oct 07 17:21:06 crc kubenswrapper[4681]: I1007 17:21:06.418284 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/edde6df1-fefc-48ee-b81a-638a564a6e18-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-mjmjw\" (UID: \"edde6df1-fefc-48ee-b81a-638a564a6e18\") " pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw"
Oct 07 17:21:06 crc kubenswrapper[4681]: I1007 17:21:06.453221 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2dtq\" (UniqueName: \"kubernetes.io/projected/edde6df1-fefc-48ee-b81a-638a564a6e18-kube-api-access-p2dtq\") pod \"dnsmasq-dns-b8fbc5445-mjmjw\" (UID: \"edde6df1-fefc-48ee-b81a-638a564a6e18\") " pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw"
Oct 07 17:21:06 crc kubenswrapper[4681]: I1007 17:21:06.525720 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw"
Oct 07 17:21:06 crc kubenswrapper[4681]: I1007 17:21:06.994767 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-mjmjw"]
Oct 07 17:21:06 crc kubenswrapper[4681]: W1007 17:21:06.999354 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedde6df1_fefc_48ee_b81a_638a564a6e18.slice/crio-d7f8630e38d462f229763a7f7c3e4d885fe9547b7ce88f98e57b00f0623b58a1 WatchSource:0}: Error finding container d7f8630e38d462f229763a7f7c3e4d885fe9547b7ce88f98e57b00f0623b58a1: Status 404 returned error can't find the container with id d7f8630e38d462f229763a7f7c3e4d885fe9547b7ce88f98e57b00f0623b58a1
Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.176217 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw" event={"ID":"edde6df1-fefc-48ee-b81a-638a564a6e18","Type":"ContainerStarted","Data":"d7f8630e38d462f229763a7f7c3e4d885fe9547b7ce88f98e57b00f0623b58a1"}
Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.176271 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-gvxct"
Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.176532 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bc7876d45-5cgq9" podUID="a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9" containerName="dnsmasq-dns" containerID="cri-o://56ff8c44d117b9cd019f1c6e8bf820bfb4659b4831e21647bab5726f91885bb0" gracePeriod=10
Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.178161 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bc7876d45-5cgq9"
Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.180035 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-gvxct"
Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.204336 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-gvxct" podStartSLOduration=7.637992757 podStartE2EDuration="8.204092508s" podCreationTimestamp="2025-10-07 17:20:59 +0000 UTC" firstStartedPulling="2025-10-07 17:21:00.884096717 +0000 UTC m=+1064.531508262" lastFinishedPulling="2025-10-07 17:21:01.450196458 +0000 UTC m=+1065.097608013" observedRunningTime="2025-10-07 17:21:07.194721476 +0000 UTC m=+1070.842133041" watchObservedRunningTime="2025-10-07 17:21:07.204092508 +0000 UTC m=+1070.851504063"
Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.257721 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.280420 4681 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/swift-storage-0" Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.286295 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-fc5zz" Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.286535 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.289271 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.289560 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.296920 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.434271 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/e111df37-d4f7-4dc5-ad9a-04b05519309a-cache\") pod \"swift-storage-0\" (UID: \"e111df37-d4f7-4dc5-ad9a-04b05519309a\") " pod="openstack/swift-storage-0" Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.434317 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"e111df37-d4f7-4dc5-ad9a-04b05519309a\") " pod="openstack/swift-storage-0" Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.434573 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e111df37-d4f7-4dc5-ad9a-04b05519309a-etc-swift\") pod \"swift-storage-0\" (UID: \"e111df37-d4f7-4dc5-ad9a-04b05519309a\") " pod="openstack/swift-storage-0" Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.434662 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/e111df37-d4f7-4dc5-ad9a-04b05519309a-lock\") pod \"swift-storage-0\" (UID: \"e111df37-d4f7-4dc5-ad9a-04b05519309a\") " pod="openstack/swift-storage-0" Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.434722 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwszh\" (UniqueName: \"kubernetes.io/projected/e111df37-d4f7-4dc5-ad9a-04b05519309a-kube-api-access-cwszh\") pod \"swift-storage-0\" (UID: \"e111df37-d4f7-4dc5-ad9a-04b05519309a\") " pod="openstack/swift-storage-0" Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.538695 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/e111df37-d4f7-4dc5-ad9a-04b05519309a-lock\") pod \"swift-storage-0\" (UID: \"e111df37-d4f7-4dc5-ad9a-04b05519309a\") " pod="openstack/swift-storage-0" Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.538765 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwszh\" (UniqueName: \"kubernetes.io/projected/e111df37-d4f7-4dc5-ad9a-04b05519309a-kube-api-access-cwszh\") pod \"swift-storage-0\" (UID: \"e111df37-d4f7-4dc5-ad9a-04b05519309a\") " pod="openstack/swift-storage-0" Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.538818 4681 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/e111df37-d4f7-4dc5-ad9a-04b05519309a-cache\") pod \"swift-storage-0\" (UID: \"e111df37-d4f7-4dc5-ad9a-04b05519309a\") " pod="openstack/swift-storage-0" Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.538844 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"e111df37-d4f7-4dc5-ad9a-04b05519309a\") " pod="openstack/swift-storage-0" Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.538886 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e111df37-d4f7-4dc5-ad9a-04b05519309a-etc-swift\") pod \"swift-storage-0\" (UID: \"e111df37-d4f7-4dc5-ad9a-04b05519309a\") " pod="openstack/swift-storage-0" Oct 07 17:21:07 crc kubenswrapper[4681]: E1007 17:21:07.539112 4681 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 17:21:07 crc kubenswrapper[4681]: E1007 17:21:07.539130 4681 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 17:21:07 crc kubenswrapper[4681]: E1007 17:21:07.539183 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e111df37-d4f7-4dc5-ad9a-04b05519309a-etc-swift podName:e111df37-d4f7-4dc5-ad9a-04b05519309a nodeName:}" failed. No retries permitted until 2025-10-07 17:21:08.03916444 +0000 UTC m=+1071.686575995 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e111df37-d4f7-4dc5-ad9a-04b05519309a-etc-swift") pod "swift-storage-0" (UID: "e111df37-d4f7-4dc5-ad9a-04b05519309a") : configmap "swift-ring-files" not found Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.539211 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/e111df37-d4f7-4dc5-ad9a-04b05519309a-lock\") pod \"swift-storage-0\" (UID: \"e111df37-d4f7-4dc5-ad9a-04b05519309a\") " pod="openstack/swift-storage-0" Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.539251 4681 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"e111df37-d4f7-4dc5-ad9a-04b05519309a\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/swift-storage-0" Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.539279 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/e111df37-d4f7-4dc5-ad9a-04b05519309a-cache\") pod \"swift-storage-0\" (UID: \"e111df37-d4f7-4dc5-ad9a-04b05519309a\") " pod="openstack/swift-storage-0" Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.561949 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwszh\" (UniqueName: \"kubernetes.io/projected/e111df37-d4f7-4dc5-ad9a-04b05519309a-kube-api-access-cwszh\") pod \"swift-storage-0\" (UID: \"e111df37-d4f7-4dc5-ad9a-04b05519309a\") " pod="openstack/swift-storage-0" Oct 07 17:21:07 crc kubenswrapper[4681]: I1007 17:21:07.564581 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"e111df37-d4f7-4dc5-ad9a-04b05519309a\") " pod="openstack/swift-storage-0" Oct 07 17:21:08 crc kubenswrapper[4681]: I1007 17:21:08.048753 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e111df37-d4f7-4dc5-ad9a-04b05519309a-etc-swift\") pod \"swift-storage-0\" (UID: \"e111df37-d4f7-4dc5-ad9a-04b05519309a\") " pod="openstack/swift-storage-0" Oct 07 17:21:08 crc kubenswrapper[4681]: E1007 17:21:08.048928 4681 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 17:21:08 crc kubenswrapper[4681]: E1007 17:21:08.049240 4681 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 17:21:08 crc kubenswrapper[4681]: E1007 17:21:08.049298 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e111df37-d4f7-4dc5-ad9a-04b05519309a-etc-swift podName:e111df37-d4f7-4dc5-ad9a-04b05519309a nodeName:}" failed. No retries permitted until 2025-10-07 17:21:09.049279708 +0000 UTC m=+1072.696691273 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e111df37-d4f7-4dc5-ad9a-04b05519309a-etc-swift") pod "swift-storage-0" (UID: "e111df37-d4f7-4dc5-ad9a-04b05519309a") : configmap "swift-ring-files" not found Oct 07 17:21:08 crc kubenswrapper[4681]: I1007 17:21:08.212910 4681 generic.go:334] "Generic (PLEG): container finished" podID="503e25f5-b736-418a-b51a-7f5f2ee82ba8" containerID="0f5933db75402c07d0825cb64393d5e1b5794d85ac8fdfa5f281928333d5726d" exitCode=0 Oct 07 17:21:08 crc kubenswrapper[4681]: I1007 17:21:08.213082 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xrvzv" event={"ID":"503e25f5-b736-418a-b51a-7f5f2ee82ba8","Type":"ContainerDied","Data":"0f5933db75402c07d0825cb64393d5e1b5794d85ac8fdfa5f281928333d5726d"} Oct 07 17:21:08 crc kubenswrapper[4681]: I1007 17:21:08.218530 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1209f82a-cbcc-4833-98f0-6e2a07b53aeb","Type":"ContainerStarted","Data":"bb926307f55d3c7467cc2caf71045091eb5141ef1535da1c003a621b524d9fcf"} Oct 07 17:21:08 crc kubenswrapper[4681]: I1007 17:21:08.220352 4681 generic.go:334] "Generic (PLEG): container finished" podID="5473d55d-7c8b-4e5a-ad3c-0b30d31ee9b4" containerID="143e054bb18ef5ded36340fc337b99fae598b619a949192a70f23a89a3e01f97" exitCode=0 Oct 07 17:21:08 crc kubenswrapper[4681]: I1007 17:21:08.220404 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rdh4c" event={"ID":"5473d55d-7c8b-4e5a-ad3c-0b30d31ee9b4","Type":"ContainerDied","Data":"143e054bb18ef5ded36340fc337b99fae598b619a949192a70f23a89a3e01f97"} Oct 07 17:21:08 crc kubenswrapper[4681]: I1007 17:21:08.224265 4681 generic.go:334] "Generic (PLEG): container finished" podID="d4fdd3f9-ef24-465f-96f0-7c09c34124b4" containerID="49bc70636b15c2212e643587ae10c9c263f8fa3d0e0faa8bd87b1930dfbed9aa" exitCode=0 Oct 07 17:21:08 crc kubenswrapper[4681]: I1007 17:21:08.224333 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vl7gx" event={"ID":"d4fdd3f9-ef24-465f-96f0-7c09c34124b4","Type":"ContainerDied","Data":"49bc70636b15c2212e643587ae10c9c263f8fa3d0e0faa8bd87b1930dfbed9aa"} Oct 07 17:21:08 crc kubenswrapper[4681]: 
I1007 17:21:08.238679 4681 generic.go:334] "Generic (PLEG): container finished" podID="edde6df1-fefc-48ee-b81a-638a564a6e18" containerID="9772760a33fb2cf57ae6cb84e66adf5abc975b17eca4478318dc5ffca81a0c0c" exitCode=0 Oct 07 17:21:08 crc kubenswrapper[4681]: I1007 17:21:08.238755 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw" event={"ID":"edde6df1-fefc-48ee-b81a-638a564a6e18","Type":"ContainerDied","Data":"9772760a33fb2cf57ae6cb84e66adf5abc975b17eca4478318dc5ffca81a0c0c"} Oct 07 17:21:08 crc kubenswrapper[4681]: I1007 17:21:08.254918 4681 generic.go:334] "Generic (PLEG): container finished" podID="a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9" containerID="56ff8c44d117b9cd019f1c6e8bf820bfb4659b4831e21647bab5726f91885bb0" exitCode=0 Oct 07 17:21:08 crc kubenswrapper[4681]: I1007 17:21:08.254957 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-5cgq9" event={"ID":"a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9","Type":"ContainerDied","Data":"56ff8c44d117b9cd019f1c6e8bf820bfb4659b4831e21647bab5726f91885bb0"} Oct 07 17:21:08 crc kubenswrapper[4681]: I1007 17:21:08.407017 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-5cgq9" Oct 07 17:21:08 crc kubenswrapper[4681]: I1007 17:21:08.562676 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bpjc\" (UniqueName: \"kubernetes.io/projected/a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9-kube-api-access-7bpjc\") pod \"a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9\" (UID: \"a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9\") " Oct 07 17:21:08 crc kubenswrapper[4681]: I1007 17:21:08.562777 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9-config\") pod \"a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9\" (UID: \"a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9\") " Oct 07 17:21:08 crc kubenswrapper[4681]: I1007 17:21:08.562828 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9-ovsdbserver-sb\") pod \"a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9\" (UID: \"a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9\") " Oct 07 17:21:08 crc kubenswrapper[4681]: I1007 17:21:08.562912 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9-dns-svc\") pod \"a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9\" (UID: \"a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9\") " Oct 07 17:21:08 crc kubenswrapper[4681]: I1007 17:21:08.573131 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9-kube-api-access-7bpjc" (OuterVolumeSpecName: "kube-api-access-7bpjc") pod "a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9" (UID: "a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9"). InnerVolumeSpecName "kube-api-access-7bpjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:21:08 crc kubenswrapper[4681]: I1007 17:21:08.618358 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9" (UID: "a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:21:08 crc kubenswrapper[4681]: I1007 17:21:08.639361 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9" (UID: "a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:21:08 crc kubenswrapper[4681]: I1007 17:21:08.642839 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9-config" (OuterVolumeSpecName: "config") pod "a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9" (UID: "a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:21:08 crc kubenswrapper[4681]: I1007 17:21:08.665647 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bpjc\" (UniqueName: \"kubernetes.io/projected/a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9-kube-api-access-7bpjc\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:08 crc kubenswrapper[4681]: I1007 17:21:08.666180 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:08 crc kubenswrapper[4681]: I1007 17:21:08.666277 4681 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:08 crc kubenswrapper[4681]: I1007 17:21:08.666339 4681 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:09 crc kubenswrapper[4681]: I1007 17:21:09.078699 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e111df37-d4f7-4dc5-ad9a-04b05519309a-etc-swift\") pod \"swift-storage-0\" (UID: \"e111df37-d4f7-4dc5-ad9a-04b05519309a\") " pod="openstack/swift-storage-0" Oct 07 17:21:09 crc kubenswrapper[4681]: E1007 17:21:09.078851 4681 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 17:21:09 crc kubenswrapper[4681]: E1007 17:21:09.079094 4681 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 17:21:09 crc kubenswrapper[4681]: E1007 17:21:09.079178 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e111df37-d4f7-4dc5-ad9a-04b05519309a-etc-swift podName:e111df37-d4f7-4dc5-ad9a-04b05519309a nodeName:}" failed. No retries permitted until 2025-10-07 17:21:11.079157224 +0000 UTC m=+1074.726568779 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e111df37-d4f7-4dc5-ad9a-04b05519309a-etc-swift") pod "swift-storage-0" (UID: "e111df37-d4f7-4dc5-ad9a-04b05519309a") : configmap "swift-ring-files" not found Oct 07 17:21:09 crc kubenswrapper[4681]: I1007 17:21:09.267202 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw" event={"ID":"edde6df1-fefc-48ee-b81a-638a564a6e18","Type":"ContainerStarted","Data":"875624b1bccd2198c9b96d0c8953e0d10435738f346d156531023242e1ad19c0"} Oct 07 17:21:09 crc kubenswrapper[4681]: I1007 17:21:09.268122 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw" Oct 07 17:21:09 crc kubenswrapper[4681]: I1007 17:21:09.277416 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-5cgq9" event={"ID":"a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9","Type":"ContainerDied","Data":"f7c22a7c1b34cff90eefdc85507dfa83b4d015444c4e52cf1992c344e6215a93"} Oct 07 17:21:09 crc kubenswrapper[4681]: I1007 17:21:09.277482 4681 scope.go:117] "RemoveContainer" containerID="56ff8c44d117b9cd019f1c6e8bf820bfb4659b4831e21647bab5726f91885bb0" Oct 07 17:21:09 crc kubenswrapper[4681]: I1007 17:21:09.277598 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-5cgq9" Oct 07 17:21:09 crc kubenswrapper[4681]: I1007 17:21:09.290840 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1209f82a-cbcc-4833-98f0-6e2a07b53aeb","Type":"ContainerStarted","Data":"7a07a9eb16d754253ae3acc6c6a73068056806e0d153ee5cc1e398aeed02e897"} Oct 07 17:21:09 crc kubenswrapper[4681]: I1007 17:21:09.306882 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw" podStartSLOduration=3.306858839 podStartE2EDuration="3.306858839s" podCreationTimestamp="2025-10-07 17:21:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:21:09.302102517 +0000 UTC m=+1072.949514102" watchObservedRunningTime="2025-10-07 17:21:09.306858839 +0000 UTC m=+1072.954270394" Oct 07 17:21:09 crc kubenswrapper[4681]: I1007 17:21:09.336862 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-5cgq9"] Oct 07 17:21:09 crc kubenswrapper[4681]: I1007 17:21:09.351263 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-5cgq9"] Oct 07 17:21:09 crc kubenswrapper[4681]: I1007 17:21:09.361008 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.75221784 podStartE2EDuration="9.360992951s" podCreationTimestamp="2025-10-07 17:21:00 +0000 UTC" firstStartedPulling="2025-10-07 17:21:01.310747545 +0000 UTC m=+1064.958159101" lastFinishedPulling="2025-10-07 17:21:07.919522657 +0000 UTC m=+1071.566934212" observedRunningTime="2025-10-07 17:21:09.360320361 +0000 UTC m=+1073.007731906" watchObservedRunningTime="2025-10-07 17:21:09.360992951 +0000 UTC m=+1073.008404506" Oct 07 17:21:10 crc kubenswrapper[4681]: I1007 17:21:10.036060 4681 scope.go:117] "RemoveContainer" containerID="c4605cfee6ee173eca78061a1fdcadf6a309561d7f778c81c2f47e54590196f7" Oct 07 17:21:10 crc kubenswrapper[4681]: I1007 17:21:10.187995 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-xrvzv" Oct 07 17:21:10 crc kubenswrapper[4681]: I1007 17:21:10.211335 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-rdh4c" Oct 07 17:21:10 crc kubenswrapper[4681]: I1007 17:21:10.218689 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-vl7gx" Oct 07 17:21:10 crc kubenswrapper[4681]: I1007 17:21:10.300654 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrktv\" (UniqueName: \"kubernetes.io/projected/503e25f5-b736-418a-b51a-7f5f2ee82ba8-kube-api-access-lrktv\") pod \"503e25f5-b736-418a-b51a-7f5f2ee82ba8\" (UID: \"503e25f5-b736-418a-b51a-7f5f2ee82ba8\") " Oct 07 17:21:10 crc kubenswrapper[4681]: I1007 17:21:10.300709 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rll78\" (UniqueName: \"kubernetes.io/projected/5473d55d-7c8b-4e5a-ad3c-0b30d31ee9b4-kube-api-access-rll78\") pod \"5473d55d-7c8b-4e5a-ad3c-0b30d31ee9b4\" (UID: \"5473d55d-7c8b-4e5a-ad3c-0b30d31ee9b4\") " Oct 07 17:21:10 crc kubenswrapper[4681]: I1007 17:21:10.311488 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/503e25f5-b736-418a-b51a-7f5f2ee82ba8-kube-api-access-lrktv" (OuterVolumeSpecName: "kube-api-access-lrktv") pod "503e25f5-b736-418a-b51a-7f5f2ee82ba8" (UID: "503e25f5-b736-418a-b51a-7f5f2ee82ba8"). InnerVolumeSpecName "kube-api-access-lrktv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:21:10 crc kubenswrapper[4681]: I1007 17:21:10.318149 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vl7gx" event={"ID":"d4fdd3f9-ef24-465f-96f0-7c09c34124b4","Type":"ContainerDied","Data":"613de4c2a309127a21869bbf34edb67a252b1a1739c128063e836ae92a7d3d3c"} Oct 07 17:21:10 crc kubenswrapper[4681]: I1007 17:21:10.318181 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="613de4c2a309127a21869bbf34edb67a252b1a1739c128063e836ae92a7d3d3c" Oct 07 17:21:10 crc kubenswrapper[4681]: I1007 17:21:10.318485 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-vl7gx" Oct 07 17:21:10 crc kubenswrapper[4681]: I1007 17:21:10.320141 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5473d55d-7c8b-4e5a-ad3c-0b30d31ee9b4-kube-api-access-rll78" (OuterVolumeSpecName: "kube-api-access-rll78") pod "5473d55d-7c8b-4e5a-ad3c-0b30d31ee9b4" (UID: "5473d55d-7c8b-4e5a-ad3c-0b30d31ee9b4"). InnerVolumeSpecName "kube-api-access-rll78". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:21:10 crc kubenswrapper[4681]: I1007 17:21:10.332928 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-xrvzv" Oct 07 17:21:10 crc kubenswrapper[4681]: I1007 17:21:10.333657 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xrvzv" event={"ID":"503e25f5-b736-418a-b51a-7f5f2ee82ba8","Type":"ContainerDied","Data":"a7a9a4b44291714e64d136b11bbafeb45018a1f84af6213e5f0492a0a8f3180f"} Oct 07 17:21:10 crc kubenswrapper[4681]: I1007 17:21:10.333689 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7a9a4b44291714e64d136b11bbafeb45018a1f84af6213e5f0492a0a8f3180f" Oct 07 17:21:10 crc kubenswrapper[4681]: I1007 17:21:10.339395 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-rdh4c" Oct 07 17:21:10 crc kubenswrapper[4681]: I1007 17:21:10.339957 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rdh4c" event={"ID":"5473d55d-7c8b-4e5a-ad3c-0b30d31ee9b4","Type":"ContainerDied","Data":"e84b60675e6b0c50200c8c73dd9775dc1838555b1d458b7f4060d83f17f41b80"} Oct 07 17:21:10 crc kubenswrapper[4681]: I1007 17:21:10.339987 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e84b60675e6b0c50200c8c73dd9775dc1838555b1d458b7f4060d83f17f41b80" Oct 07 17:21:10 crc kubenswrapper[4681]: I1007 17:21:10.340007 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 07 17:21:10 crc kubenswrapper[4681]: I1007 17:21:10.403760 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5phj\" (UniqueName: \"kubernetes.io/projected/d4fdd3f9-ef24-465f-96f0-7c09c34124b4-kube-api-access-g5phj\") pod \"d4fdd3f9-ef24-465f-96f0-7c09c34124b4\" (UID: \"d4fdd3f9-ef24-465f-96f0-7c09c34124b4\") " Oct 07 17:21:10 crc kubenswrapper[4681]: I1007 17:21:10.404226 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrktv\" (UniqueName: \"kubernetes.io/projected/503e25f5-b736-418a-b51a-7f5f2ee82ba8-kube-api-access-lrktv\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:10 crc kubenswrapper[4681]: I1007 17:21:10.404249 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rll78\" (UniqueName: \"kubernetes.io/projected/5473d55d-7c8b-4e5a-ad3c-0b30d31ee9b4-kube-api-access-rll78\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:10 crc kubenswrapper[4681]: I1007 17:21:10.406828 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4fdd3f9-ef24-465f-96f0-7c09c34124b4-kube-api-access-g5phj" (OuterVolumeSpecName: "kube-api-access-g5phj") pod "d4fdd3f9-ef24-465f-96f0-7c09c34124b4" (UID: "d4fdd3f9-ef24-465f-96f0-7c09c34124b4"). InnerVolumeSpecName "kube-api-access-g5phj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:21:10 crc kubenswrapper[4681]: I1007 17:21:10.506640 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5phj\" (UniqueName: \"kubernetes.io/projected/d4fdd3f9-ef24-465f-96f0-7c09c34124b4-kube-api-access-g5phj\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.042384 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9" path="/var/lib/kubelet/pods/a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9/volumes" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.116008 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e111df37-d4f7-4dc5-ad9a-04b05519309a-etc-swift\") pod \"swift-storage-0\" (UID: \"e111df37-d4f7-4dc5-ad9a-04b05519309a\") " pod="openstack/swift-storage-0" Oct 07 17:21:11 crc kubenswrapper[4681]: E1007 17:21:11.116188 4681 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 17:21:11 crc kubenswrapper[4681]: E1007 17:21:11.116203 4681 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 17:21:11 crc kubenswrapper[4681]: E1007 17:21:11.116255 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e111df37-d4f7-4dc5-ad9a-04b05519309a-etc-swift podName:e111df37-d4f7-4dc5-ad9a-04b05519309a nodeName:}" failed. No retries permitted until 2025-10-07 17:21:15.116239612 +0000 UTC m=+1078.763651167 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e111df37-d4f7-4dc5-ad9a-04b05519309a-etc-swift") pod "swift-storage-0" (UID: "e111df37-d4f7-4dc5-ad9a-04b05519309a") : configmap "swift-ring-files" not found Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.194785 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-npftw"] Oct 07 17:21:11 crc kubenswrapper[4681]: E1007 17:21:11.195166 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5473d55d-7c8b-4e5a-ad3c-0b30d31ee9b4" containerName="mariadb-database-create" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.195183 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="5473d55d-7c8b-4e5a-ad3c-0b30d31ee9b4" containerName="mariadb-database-create" Oct 07 17:21:11 crc kubenswrapper[4681]: E1007 17:21:11.195198 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4fdd3f9-ef24-465f-96f0-7c09c34124b4" containerName="mariadb-database-create" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.195204 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4fdd3f9-ef24-465f-96f0-7c09c34124b4" containerName="mariadb-database-create" Oct 07 17:21:11 crc kubenswrapper[4681]: E1007 17:21:11.195228 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9" containerName="init" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.195236 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9" containerName="init" Oct 07 17:21:11 crc kubenswrapper[4681]: E1007 17:21:11.195249 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9" containerName="dnsmasq-dns" Oct 07 17:21:11 crc 
kubenswrapper[4681]: I1007 17:21:11.195257 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9" containerName="dnsmasq-dns" Oct 07 17:21:11 crc kubenswrapper[4681]: E1007 17:21:11.195272 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="503e25f5-b736-418a-b51a-7f5f2ee82ba8" containerName="mariadb-database-create" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.195278 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="503e25f5-b736-418a-b51a-7f5f2ee82ba8" containerName="mariadb-database-create" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.195491 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="503e25f5-b736-418a-b51a-7f5f2ee82ba8" containerName="mariadb-database-create" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.195506 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1c7e087-913d-48f2-a9ae-3a22ba9ef4b9" containerName="dnsmasq-dns" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.195517 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4fdd3f9-ef24-465f-96f0-7c09c34124b4" containerName="mariadb-database-create" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.195526 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="5473d55d-7c8b-4e5a-ad3c-0b30d31ee9b4" containerName="mariadb-database-create" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.196168 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-npftw" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.198231 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.198711 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.204634 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.266300 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-npftw"] Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.319986 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/51ee2d02-f1ea-4e04-817a-c08925a2078d-swiftconf\") pod \"swift-ring-rebalance-npftw\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " pod="openstack/swift-ring-rebalance-npftw" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.320400 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjbkr\" (UniqueName: \"kubernetes.io/projected/51ee2d02-f1ea-4e04-817a-c08925a2078d-kube-api-access-sjbkr\") pod \"swift-ring-rebalance-npftw\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " pod="openstack/swift-ring-rebalance-npftw" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.320491 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/51ee2d02-f1ea-4e04-817a-c08925a2078d-dispersionconf\") pod \"swift-ring-rebalance-npftw\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " pod="openstack/swift-ring-rebalance-npftw" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.320533 4681 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51ee2d02-f1ea-4e04-817a-c08925a2078d-scripts\") pod \"swift-ring-rebalance-npftw\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " pod="openstack/swift-ring-rebalance-npftw" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.320549 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ee2d02-f1ea-4e04-817a-c08925a2078d-combined-ca-bundle\") pod \"swift-ring-rebalance-npftw\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " pod="openstack/swift-ring-rebalance-npftw" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.320591 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/51ee2d02-f1ea-4e04-817a-c08925a2078d-ring-data-devices\") pod \"swift-ring-rebalance-npftw\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " pod="openstack/swift-ring-rebalance-npftw" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.320632 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/51ee2d02-f1ea-4e04-817a-c08925a2078d-etc-swift\") pod \"swift-ring-rebalance-npftw\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " pod="openstack/swift-ring-rebalance-npftw" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.346819 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"88c0d090-0803-4fff-a9a3-9b41529b8a23","Type":"ContainerStarted","Data":"a150eb7edc06fc7b830f6ee5465ba9379608e4d886850e0a437b8f3dd5a28a91"} Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.347263 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.369410 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.922143513 podStartE2EDuration="56.369387448s" podCreationTimestamp="2025-10-07 17:20:15 +0000 UTC" firstStartedPulling="2025-10-07 17:20:16.638431442 +0000 UTC m=+1020.285842997" lastFinishedPulling="2025-10-07 17:21:10.085675377 +0000 UTC m=+1073.733086932" observedRunningTime="2025-10-07 17:21:11.360427067 +0000 UTC m=+1075.007838622" watchObservedRunningTime="2025-10-07 17:21:11.369387448 +0000 UTC m=+1075.016799003" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.421809 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/51ee2d02-f1ea-4e04-817a-c08925a2078d-swiftconf\") pod \"swift-ring-rebalance-npftw\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " pod="openstack/swift-ring-rebalance-npftw" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.421870 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjbkr\" (UniqueName: \"kubernetes.io/projected/51ee2d02-f1ea-4e04-817a-c08925a2078d-kube-api-access-sjbkr\") pod \"swift-ring-rebalance-npftw\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " pod="openstack/swift-ring-rebalance-npftw" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.421981 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/51ee2d02-f1ea-4e04-817a-c08925a2078d-dispersionconf\") pod \"swift-ring-rebalance-npftw\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " pod="openstack/swift-ring-rebalance-npftw" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.422038 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51ee2d02-f1ea-4e04-817a-c08925a2078d-scripts\") pod \"swift-ring-rebalance-npftw\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " pod="openstack/swift-ring-rebalance-npftw" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.422060 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ee2d02-f1ea-4e04-817a-c08925a2078d-combined-ca-bundle\") pod \"swift-ring-rebalance-npftw\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " pod="openstack/swift-ring-rebalance-npftw" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.422119 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/51ee2d02-f1ea-4e04-817a-c08925a2078d-ring-data-devices\") pod \"swift-ring-rebalance-npftw\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " pod="openstack/swift-ring-rebalance-npftw" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.422146 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/51ee2d02-f1ea-4e04-817a-c08925a2078d-etc-swift\") pod \"swift-ring-rebalance-npftw\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " pod="openstack/swift-ring-rebalance-npftw" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.422511 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/51ee2d02-f1ea-4e04-817a-c08925a2078d-etc-swift\") pod \"swift-ring-rebalance-npftw\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " pod="openstack/swift-ring-rebalance-npftw" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.423267 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51ee2d02-f1ea-4e04-817a-c08925a2078d-scripts\") pod \"swift-ring-rebalance-npftw\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " pod="openstack/swift-ring-rebalance-npftw" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.423358 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/51ee2d02-f1ea-4e04-817a-c08925a2078d-ring-data-devices\") pod \"swift-ring-rebalance-npftw\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " pod="openstack/swift-ring-rebalance-npftw" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.428361 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/51ee2d02-f1ea-4e04-817a-c08925a2078d-swiftconf\") pod \"swift-ring-rebalance-npftw\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " pod="openstack/swift-ring-rebalance-npftw" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.428518 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ee2d02-f1ea-4e04-817a-c08925a2078d-combined-ca-bundle\") pod \"swift-ring-rebalance-npftw\" (UID: 
\"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " pod="openstack/swift-ring-rebalance-npftw" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.429668 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/51ee2d02-f1ea-4e04-817a-c08925a2078d-dispersionconf\") pod \"swift-ring-rebalance-npftw\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " pod="openstack/swift-ring-rebalance-npftw" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.444761 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjbkr\" (UniqueName: \"kubernetes.io/projected/51ee2d02-f1ea-4e04-817a-c08925a2078d-kube-api-access-sjbkr\") pod \"swift-ring-rebalance-npftw\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " pod="openstack/swift-ring-rebalance-npftw" Oct 07 17:21:11 crc kubenswrapper[4681]: I1007 17:21:11.511412 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-npftw" Oct 07 17:21:12 crc kubenswrapper[4681]: I1007 17:21:12.003076 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-npftw"] Oct 07 17:21:12 crc kubenswrapper[4681]: I1007 17:21:12.195733 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 17:21:12 crc kubenswrapper[4681]: I1007 17:21:12.195811 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 17:21:12 crc kubenswrapper[4681]: I1007 17:21:12.356000 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-npftw" event={"ID":"51ee2d02-f1ea-4e04-817a-c08925a2078d","Type":"ContainerStarted","Data":"51b8f145216333b112db48506eae0d35e3001473a8bea07479d70251e1229d2b"} Oct 07 17:21:15 crc kubenswrapper[4681]: I1007 17:21:15.206003 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e111df37-d4f7-4dc5-ad9a-04b05519309a-etc-swift\") pod \"swift-storage-0\" (UID: \"e111df37-d4f7-4dc5-ad9a-04b05519309a\") " pod="openstack/swift-storage-0" Oct 07 17:21:15 crc kubenswrapper[4681]: E1007 17:21:15.206243 4681 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 17:21:15 crc kubenswrapper[4681]: E1007 17:21:15.206390 4681 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 17:21:15 crc kubenswrapper[4681]: E1007 17:21:15.206453 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e111df37-d4f7-4dc5-ad9a-04b05519309a-etc-swift podName:e111df37-d4f7-4dc5-ad9a-04b05519309a nodeName:}" failed. No retries permitted until 2025-10-07 17:21:23.206433326 +0000 UTC m=+1086.853844881 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e111df37-d4f7-4dc5-ad9a-04b05519309a-etc-swift") pod "swift-storage-0" (UID: "e111df37-d4f7-4dc5-ad9a-04b05519309a") : configmap "swift-ring-files" not found Oct 07 17:21:15 crc kubenswrapper[4681]: I1007 17:21:15.992779 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 07 17:21:16 crc kubenswrapper[4681]: I1007 17:21:16.527058 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw" Oct 07 17:21:16 crc kubenswrapper[4681]: I1007 17:21:16.587348 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-gvxct"] Oct 07 17:21:16 crc kubenswrapper[4681]: I1007 17:21:16.587621 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-gvxct" podUID="bccb5ffe-cd60-4f08-bb97-aebd92cea802" containerName="dnsmasq-dns" containerID="cri-o://03d7a0cde160aeb80ee9d2ef5200355a132eccea6d49872c1389fb1088950593" gracePeriod=10 Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.050648 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-gvxct" Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.135265 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xl54\" (UniqueName: \"kubernetes.io/projected/bccb5ffe-cd60-4f08-bb97-aebd92cea802-kube-api-access-8xl54\") pod \"bccb5ffe-cd60-4f08-bb97-aebd92cea802\" (UID: \"bccb5ffe-cd60-4f08-bb97-aebd92cea802\") " Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.135315 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bccb5ffe-cd60-4f08-bb97-aebd92cea802-ovsdbserver-nb\") pod \"bccb5ffe-cd60-4f08-bb97-aebd92cea802\" (UID: \"bccb5ffe-cd60-4f08-bb97-aebd92cea802\") " Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.135384 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bccb5ffe-cd60-4f08-bb97-aebd92cea802-ovsdbserver-sb\") pod \"bccb5ffe-cd60-4f08-bb97-aebd92cea802\" (UID: \"bccb5ffe-cd60-4f08-bb97-aebd92cea802\") " Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.135471 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bccb5ffe-cd60-4f08-bb97-aebd92cea802-config\") pod \"bccb5ffe-cd60-4f08-bb97-aebd92cea802\" (UID: \"bccb5ffe-cd60-4f08-bb97-aebd92cea802\") " Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.135567 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bccb5ffe-cd60-4f08-bb97-aebd92cea802-dns-svc\") pod \"bccb5ffe-cd60-4f08-bb97-aebd92cea802\" (UID: \"bccb5ffe-cd60-4f08-bb97-aebd92cea802\") " Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.141906 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bccb5ffe-cd60-4f08-bb97-aebd92cea802-kube-api-access-8xl54" (OuterVolumeSpecName: "kube-api-access-8xl54") pod "bccb5ffe-cd60-4f08-bb97-aebd92cea802" (UID: "bccb5ffe-cd60-4f08-bb97-aebd92cea802"). InnerVolumeSpecName "kube-api-access-8xl54". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.174171 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bccb5ffe-cd60-4f08-bb97-aebd92cea802-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bccb5ffe-cd60-4f08-bb97-aebd92cea802" (UID: "bccb5ffe-cd60-4f08-bb97-aebd92cea802"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.179588 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bccb5ffe-cd60-4f08-bb97-aebd92cea802-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bccb5ffe-cd60-4f08-bb97-aebd92cea802" (UID: "bccb5ffe-cd60-4f08-bb97-aebd92cea802"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.180753 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bccb5ffe-cd60-4f08-bb97-aebd92cea802-config" (OuterVolumeSpecName: "config") pod "bccb5ffe-cd60-4f08-bb97-aebd92cea802" (UID: "bccb5ffe-cd60-4f08-bb97-aebd92cea802"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.237778 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xl54\" (UniqueName: \"kubernetes.io/projected/bccb5ffe-cd60-4f08-bb97-aebd92cea802-kube-api-access-8xl54\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.237809 4681 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bccb5ffe-cd60-4f08-bb97-aebd92cea802-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.237817 4681 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bccb5ffe-cd60-4f08-bb97-aebd92cea802-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.237826 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bccb5ffe-cd60-4f08-bb97-aebd92cea802-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.304496 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bccb5ffe-cd60-4f08-bb97-aebd92cea802-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bccb5ffe-cd60-4f08-bb97-aebd92cea802" (UID: "bccb5ffe-cd60-4f08-bb97-aebd92cea802"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.339817 4681 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bccb5ffe-cd60-4f08-bb97-aebd92cea802-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.412607 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-npftw" event={"ID":"51ee2d02-f1ea-4e04-817a-c08925a2078d","Type":"ContainerStarted","Data":"b17b594cf7c8748ef3bf1b34c439c80ace7f9b25e57701e150709b7111d7dbb8"} Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.414552 4681 generic.go:334] "Generic (PLEG): container finished" podID="bccb5ffe-cd60-4f08-bb97-aebd92cea802" containerID="03d7a0cde160aeb80ee9d2ef5200355a132eccea6d49872c1389fb1088950593" exitCode=0 Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.414619 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-gvxct" Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.414637 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-gvxct" event={"ID":"bccb5ffe-cd60-4f08-bb97-aebd92cea802","Type":"ContainerDied","Data":"03d7a0cde160aeb80ee9d2ef5200355a132eccea6d49872c1389fb1088950593"} Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.414685 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-gvxct" event={"ID":"bccb5ffe-cd60-4f08-bb97-aebd92cea802","Type":"ContainerDied","Data":"5d94eb8d091d27b605d4da6b0e07b7ff9b65ebe3095418ef6d73b5c26dd91d45"} Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.414708 4681 scope.go:117] "RemoveContainer" containerID="03d7a0cde160aeb80ee9d2ef5200355a132eccea6d49872c1389fb1088950593" Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.440675 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-npftw" podStartSLOduration=2.259507494 podStartE2EDuration="6.440651906s" podCreationTimestamp="2025-10-07 17:21:11 +0000 UTC" firstStartedPulling="2025-10-07 17:21:11.996639316 +0000 UTC m=+1075.644050871" lastFinishedPulling="2025-10-07 17:21:16.177783728 +0000 UTC m=+1079.825195283" observedRunningTime="2025-10-07 17:21:17.436246853 +0000 UTC m=+1081.083658408" watchObservedRunningTime="2025-10-07 17:21:17.440651906 +0000 UTC m=+1081.088063461" Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.457947 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-gvxct"] Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.484839 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-gvxct"] Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.563171 4681 scope.go:117] "RemoveContainer" containerID="49ae91953c0ddfdfefac17b394da4d992fa915a108902e404bf21836b9b88025" Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.599106 4681 scope.go:117] "RemoveContainer" containerID="03d7a0cde160aeb80ee9d2ef5200355a132eccea6d49872c1389fb1088950593" Oct 07 17:21:17 crc kubenswrapper[4681]: E1007 17:21:17.602443 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03d7a0cde160aeb80ee9d2ef5200355a132eccea6d49872c1389fb1088950593\": container with ID starting with 03d7a0cde160aeb80ee9d2ef5200355a132eccea6d49872c1389fb1088950593 not found: ID 
does not exist" containerID="03d7a0cde160aeb80ee9d2ef5200355a132eccea6d49872c1389fb1088950593" Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.602586 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d7a0cde160aeb80ee9d2ef5200355a132eccea6d49872c1389fb1088950593"} err="failed to get container status \"03d7a0cde160aeb80ee9d2ef5200355a132eccea6d49872c1389fb1088950593\": rpc error: code = NotFound desc = could not find container \"03d7a0cde160aeb80ee9d2ef5200355a132eccea6d49872c1389fb1088950593\": container with ID starting with 03d7a0cde160aeb80ee9d2ef5200355a132eccea6d49872c1389fb1088950593 not found: ID does not exist" Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.602679 4681 scope.go:117] "RemoveContainer" containerID="49ae91953c0ddfdfefac17b394da4d992fa915a108902e404bf21836b9b88025" Oct 07 17:21:17 crc kubenswrapper[4681]: E1007 17:21:17.603028 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49ae91953c0ddfdfefac17b394da4d992fa915a108902e404bf21836b9b88025\": container with ID starting with 49ae91953c0ddfdfefac17b394da4d992fa915a108902e404bf21836b9b88025 not found: ID does not exist" containerID="49ae91953c0ddfdfefac17b394da4d992fa915a108902e404bf21836b9b88025" Oct 07 17:21:17 crc kubenswrapper[4681]: I1007 17:21:17.603056 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49ae91953c0ddfdfefac17b394da4d992fa915a108902e404bf21836b9b88025"} err="failed to get container status \"49ae91953c0ddfdfefac17b394da4d992fa915a108902e404bf21836b9b88025\": rpc error: code = NotFound desc = could not find container \"49ae91953c0ddfdfefac17b394da4d992fa915a108902e404bf21836b9b88025\": container with ID starting with 49ae91953c0ddfdfefac17b394da4d992fa915a108902e404bf21836b9b88025 not found: ID does not exist" Oct 07 17:21:19 crc kubenswrapper[4681]: I1007 17:21:19.042016 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bccb5ffe-cd60-4f08-bb97-aebd92cea802" path="/var/lib/kubelet/pods/bccb5ffe-cd60-4f08-bb97-aebd92cea802/volumes" Oct 07 17:21:20 crc kubenswrapper[4681]: I1007 17:21:20.913206 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 07 17:21:23 crc kubenswrapper[4681]: I1007 17:21:23.234469 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e111df37-d4f7-4dc5-ad9a-04b05519309a-etc-swift\") pod \"swift-storage-0\" (UID: \"e111df37-d4f7-4dc5-ad9a-04b05519309a\") " pod="openstack/swift-storage-0" Oct 07 17:21:23 crc kubenswrapper[4681]: E1007 17:21:23.235141 4681 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 17:21:23 crc kubenswrapper[4681]: E1007 17:21:23.235157 4681 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 17:21:23 crc kubenswrapper[4681]: E1007 17:21:23.235202 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e111df37-d4f7-4dc5-ad9a-04b05519309a-etc-swift podName:e111df37-d4f7-4dc5-ad9a-04b05519309a nodeName:}" failed. No retries permitted until 2025-10-07 17:21:39.23518769 +0000 UTC m=+1102.882599245 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e111df37-d4f7-4dc5-ad9a-04b05519309a-etc-swift") pod "swift-storage-0" (UID: "e111df37-d4f7-4dc5-ad9a-04b05519309a") : configmap "swift-ring-files" not found Oct 07 17:21:23 crc kubenswrapper[4681]: I1007 17:21:23.583512 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5f89-account-create-clbjp"] Oct 07 17:21:23 crc kubenswrapper[4681]: E1007 17:21:23.583930 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bccb5ffe-cd60-4f08-bb97-aebd92cea802" containerName="dnsmasq-dns" Oct 07 17:21:23 crc kubenswrapper[4681]: I1007 17:21:23.583952 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="bccb5ffe-cd60-4f08-bb97-aebd92cea802" containerName="dnsmasq-dns" Oct 07 17:21:23 crc kubenswrapper[4681]: E1007 17:21:23.583979 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bccb5ffe-cd60-4f08-bb97-aebd92cea802" containerName="init" Oct 07 17:21:23 crc kubenswrapper[4681]: I1007 17:21:23.583988 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="bccb5ffe-cd60-4f08-bb97-aebd92cea802" containerName="init" Oct 07 17:21:23 crc kubenswrapper[4681]: I1007 17:21:23.584212 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="bccb5ffe-cd60-4f08-bb97-aebd92cea802" containerName="dnsmasq-dns" Oct 07 17:21:23 crc kubenswrapper[4681]: I1007 17:21:23.584821 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5f89-account-create-clbjp" Oct 07 17:21:23 crc kubenswrapper[4681]: I1007 17:21:23.587026 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 07 17:21:23 crc kubenswrapper[4681]: I1007 17:21:23.599175 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5f89-account-create-clbjp"] Oct 07 17:21:23 crc kubenswrapper[4681]: I1007 17:21:23.642406 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzvm7\" (UniqueName: \"kubernetes.io/projected/72933878-17c1-4cb1-b068-1c19741adf5d-kube-api-access-jzvm7\") pod \"keystone-5f89-account-create-clbjp\" (UID: \"72933878-17c1-4cb1-b068-1c19741adf5d\") " pod="openstack/keystone-5f89-account-create-clbjp" Oct 07 17:21:23 crc kubenswrapper[4681]: I1007 17:21:23.744599 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzvm7\" (UniqueName: \"kubernetes.io/projected/72933878-17c1-4cb1-b068-1c19741adf5d-kube-api-access-jzvm7\") pod \"keystone-5f89-account-create-clbjp\" (UID: \"72933878-17c1-4cb1-b068-1c19741adf5d\") " pod="openstack/keystone-5f89-account-create-clbjp" Oct 07 17:21:23 crc kubenswrapper[4681]: I1007 17:21:23.767149 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzvm7\" (UniqueName: \"kubernetes.io/projected/72933878-17c1-4cb1-b068-1c19741adf5d-kube-api-access-jzvm7\") pod \"keystone-5f89-account-create-clbjp\" (UID: \"72933878-17c1-4cb1-b068-1c19741adf5d\") " pod="openstack/keystone-5f89-account-create-clbjp" Oct 07 17:21:23 crc kubenswrapper[4681]: I1007 17:21:23.791496 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-87a1-account-create-ngqhh"] Oct 07 17:21:23 crc kubenswrapper[4681]: I1007 17:21:23.792629 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-87a1-account-create-ngqhh" Oct 07 17:21:23 crc kubenswrapper[4681]: I1007 17:21:23.794620 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 07 17:21:23 crc kubenswrapper[4681]: I1007 17:21:23.810551 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-87a1-account-create-ngqhh"] Oct 07 17:21:23 crc kubenswrapper[4681]: I1007 17:21:23.846094 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctxp2\" (UniqueName: \"kubernetes.io/projected/5da275ce-2936-42ba-a43d-52605c4f5cb4-kube-api-access-ctxp2\") pod \"placement-87a1-account-create-ngqhh\" (UID: \"5da275ce-2936-42ba-a43d-52605c4f5cb4\") " pod="openstack/placement-87a1-account-create-ngqhh" Oct 07 17:21:23 crc kubenswrapper[4681]: I1007 17:21:23.851388 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-xhwkc" podUID="8be45f14-7feb-40fa-a0a8-919c6d8cd052" containerName="ovn-controller" probeResult="failure" output=< Oct 07 17:21:23 crc kubenswrapper[4681]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 07 17:21:23 crc kubenswrapper[4681]: > Oct 07 17:21:23 crc kubenswrapper[4681]: I1007 17:21:23.914678 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5f89-account-create-clbjp" Oct 07 17:21:23 crc kubenswrapper[4681]: I1007 17:21:23.948080 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctxp2\" (UniqueName: \"kubernetes.io/projected/5da275ce-2936-42ba-a43d-52605c4f5cb4-kube-api-access-ctxp2\") pod \"placement-87a1-account-create-ngqhh\" (UID: \"5da275ce-2936-42ba-a43d-52605c4f5cb4\") " pod="openstack/placement-87a1-account-create-ngqhh" Oct 07 17:21:23 crc kubenswrapper[4681]: I1007 17:21:23.974820 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctxp2\" (UniqueName: \"kubernetes.io/projected/5da275ce-2936-42ba-a43d-52605c4f5cb4-kube-api-access-ctxp2\") pod \"placement-87a1-account-create-ngqhh\" (UID: \"5da275ce-2936-42ba-a43d-52605c4f5cb4\") " pod="openstack/placement-87a1-account-create-ngqhh" Oct 07 17:21:24 crc kubenswrapper[4681]: I1007 17:21:24.129279 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-87a1-account-create-ngqhh" Oct 07 17:21:24 crc kubenswrapper[4681]: I1007 17:21:24.219611 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-5a1b-account-create-swzjl"] Oct 07 17:21:24 crc kubenswrapper[4681]: I1007 17:21:24.220533 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5a1b-account-create-swzjl" Oct 07 17:21:24 crc kubenswrapper[4681]: I1007 17:21:24.222606 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 07 17:21:24 crc kubenswrapper[4681]: I1007 17:21:24.231490 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5a1b-account-create-swzjl"] Oct 07 17:21:24 crc kubenswrapper[4681]: I1007 17:21:24.359214 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcb2w\" (UniqueName: \"kubernetes.io/projected/4ebb4c42-5825-4c0f-9cb5-ab4b915e4bbf-kube-api-access-lcb2w\") pod \"glance-5a1b-account-create-swzjl\" (UID: \"4ebb4c42-5825-4c0f-9cb5-ab4b915e4bbf\") " pod="openstack/glance-5a1b-account-create-swzjl" Oct 07 17:21:24 crc kubenswrapper[4681]: I1007 17:21:24.400145 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5f89-account-create-clbjp"] Oct 07 17:21:24 crc kubenswrapper[4681]: W1007 17:21:24.404340 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72933878_17c1_4cb1_b068_1c19741adf5d.slice/crio-a01557f1da0dd0b6df4942e6810cbdf635fe5dabcd72fc492c3bf201ca47e185 WatchSource:0}: Error finding container a01557f1da0dd0b6df4942e6810cbdf635fe5dabcd72fc492c3bf201ca47e185: Status 404 returned error can't find the container with id a01557f1da0dd0b6df4942e6810cbdf635fe5dabcd72fc492c3bf201ca47e185 Oct 07 17:21:24 crc kubenswrapper[4681]: I1007 17:21:24.460711 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcb2w\" (UniqueName: \"kubernetes.io/projected/4ebb4c42-5825-4c0f-9cb5-ab4b915e4bbf-kube-api-access-lcb2w\") pod \"glance-5a1b-account-create-swzjl\" (UID: \"4ebb4c42-5825-4c0f-9cb5-ab4b915e4bbf\") " pod="openstack/glance-5a1b-account-create-swzjl" Oct 07 17:21:24 crc kubenswrapper[4681]: I1007 17:21:24.468091 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5f89-account-create-clbjp" event={"ID":"72933878-17c1-4cb1-b068-1c19741adf5d","Type":"ContainerStarted","Data":"a01557f1da0dd0b6df4942e6810cbdf635fe5dabcd72fc492c3bf201ca47e185"} Oct 07 17:21:24 crc kubenswrapper[4681]: I1007 17:21:24.469792 4681 generic.go:334] "Generic (PLEG): container finished" podID="51ee2d02-f1ea-4e04-817a-c08925a2078d" containerID="b17b594cf7c8748ef3bf1b34c439c80ace7f9b25e57701e150709b7111d7dbb8" exitCode=0 Oct 07 17:21:24 crc kubenswrapper[4681]: I1007 17:21:24.469825 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-npftw" event={"ID":"51ee2d02-f1ea-4e04-817a-c08925a2078d","Type":"ContainerDied","Data":"b17b594cf7c8748ef3bf1b34c439c80ace7f9b25e57701e150709b7111d7dbb8"} Oct 07 17:21:24 crc kubenswrapper[4681]: I1007 17:21:24.480255 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcb2w\" (UniqueName: \"kubernetes.io/projected/4ebb4c42-5825-4c0f-9cb5-ab4b915e4bbf-kube-api-access-lcb2w\") pod \"glance-5a1b-account-create-swzjl\" (UID: \"4ebb4c42-5825-4c0f-9cb5-ab4b915e4bbf\") " pod="openstack/glance-5a1b-account-create-swzjl" Oct 07 17:21:24 crc kubenswrapper[4681]: I1007 17:21:24.543192 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5a1b-account-create-swzjl" Oct 07 17:21:24 crc kubenswrapper[4681]: I1007 17:21:24.636077 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-87a1-account-create-ngqhh"] Oct 07 17:21:24 crc kubenswrapper[4681]: W1007 17:21:24.651421 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5da275ce_2936_42ba_a43d_52605c4f5cb4.slice/crio-1cb36ba05b91f164e456635ff5c90a7f21f39dde0ed621868e11b59fed9d3979 WatchSource:0}: Error finding container 1cb36ba05b91f164e456635ff5c90a7f21f39dde0ed621868e11b59fed9d3979: Status 404 returned error can't find the container with id 1cb36ba05b91f164e456635ff5c90a7f21f39dde0ed621868e11b59fed9d3979 Oct 07 17:21:24 crc kubenswrapper[4681]: I1007 17:21:24.969844 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5a1b-account-create-swzjl"] Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.478347 4681 generic.go:334] "Generic (PLEG): container finished" podID="72933878-17c1-4cb1-b068-1c19741adf5d" containerID="f171d9d5a3264d254de50bf4743cd673d165d1fade2b26a157194a10d64259c4" exitCode=0 Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.478410 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5f89-account-create-clbjp" event={"ID":"72933878-17c1-4cb1-b068-1c19741adf5d","Type":"ContainerDied","Data":"f171d9d5a3264d254de50bf4743cd673d165d1fade2b26a157194a10d64259c4"} Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.479923 4681 generic.go:334] "Generic (PLEG): container finished" podID="5da275ce-2936-42ba-a43d-52605c4f5cb4" containerID="7e52e96aa8195974c691c7d8b83a6a780971a1ce108967044f9ac2b7050c4b89" exitCode=0 Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.479996 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-87a1-account-create-ngqhh" event={"ID":"5da275ce-2936-42ba-a43d-52605c4f5cb4","Type":"ContainerDied","Data":"7e52e96aa8195974c691c7d8b83a6a780971a1ce108967044f9ac2b7050c4b89"} Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.480026 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-87a1-account-create-ngqhh" event={"ID":"5da275ce-2936-42ba-a43d-52605c4f5cb4","Type":"ContainerStarted","Data":"1cb36ba05b91f164e456635ff5c90a7f21f39dde0ed621868e11b59fed9d3979"} Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.481371 4681 generic.go:334] "Generic (PLEG): container finished" podID="4ebb4c42-5825-4c0f-9cb5-ab4b915e4bbf" containerID="7965398920f60875d06f0d0a3eb067e7e1fb8c8237c552fa08137e40a5387e54" exitCode=0 Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.481428 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5a1b-account-create-swzjl" event={"ID":"4ebb4c42-5825-4c0f-9cb5-ab4b915e4bbf","Type":"ContainerDied","Data":"7965398920f60875d06f0d0a3eb067e7e1fb8c8237c552fa08137e40a5387e54"} Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.481493 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5a1b-account-create-swzjl" event={"ID":"4ebb4c42-5825-4c0f-9cb5-ab4b915e4bbf","Type":"ContainerStarted","Data":"e8e979427b77991a21b6ce9131e31a9c985442c2825651cdb218a9b5d338c354"} Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.794946 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-npftw" Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.893399 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/51ee2d02-f1ea-4e04-817a-c08925a2078d-ring-data-devices\") pod \"51ee2d02-f1ea-4e04-817a-c08925a2078d\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.893538 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ee2d02-f1ea-4e04-817a-c08925a2078d-combined-ca-bundle\") pod \"51ee2d02-f1ea-4e04-817a-c08925a2078d\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.893571 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/51ee2d02-f1ea-4e04-817a-c08925a2078d-swiftconf\") pod \"51ee2d02-f1ea-4e04-817a-c08925a2078d\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.893597 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51ee2d02-f1ea-4e04-817a-c08925a2078d-scripts\") pod \"51ee2d02-f1ea-4e04-817a-c08925a2078d\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.893718 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/51ee2d02-f1ea-4e04-817a-c08925a2078d-dispersionconf\") pod \"51ee2d02-f1ea-4e04-817a-c08925a2078d\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.893755 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjbkr\" (UniqueName: \"kubernetes.io/projected/51ee2d02-f1ea-4e04-817a-c08925a2078d-kube-api-access-sjbkr\") pod \"51ee2d02-f1ea-4e04-817a-c08925a2078d\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.893778 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/51ee2d02-f1ea-4e04-817a-c08925a2078d-etc-swift\") pod \"51ee2d02-f1ea-4e04-817a-c08925a2078d\" (UID: \"51ee2d02-f1ea-4e04-817a-c08925a2078d\") " Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.894213 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51ee2d02-f1ea-4e04-817a-c08925a2078d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "51ee2d02-f1ea-4e04-817a-c08925a2078d" (UID: "51ee2d02-f1ea-4e04-817a-c08925a2078d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.895345 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51ee2d02-f1ea-4e04-817a-c08925a2078d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "51ee2d02-f1ea-4e04-817a-c08925a2078d" (UID: "51ee2d02-f1ea-4e04-817a-c08925a2078d"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.937175 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ee2d02-f1ea-4e04-817a-c08925a2078d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "51ee2d02-f1ea-4e04-817a-c08925a2078d" (UID: "51ee2d02-f1ea-4e04-817a-c08925a2078d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.937589 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51ee2d02-f1ea-4e04-817a-c08925a2078d-scripts" (OuterVolumeSpecName: "scripts") pod "51ee2d02-f1ea-4e04-817a-c08925a2078d" (UID: "51ee2d02-f1ea-4e04-817a-c08925a2078d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.939202 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51ee2d02-f1ea-4e04-817a-c08925a2078d-kube-api-access-sjbkr" (OuterVolumeSpecName: "kube-api-access-sjbkr") pod "51ee2d02-f1ea-4e04-817a-c08925a2078d" (UID: "51ee2d02-f1ea-4e04-817a-c08925a2078d"). InnerVolumeSpecName "kube-api-access-sjbkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.995339 4681 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51ee2d02-f1ea-4e04-817a-c08925a2078d-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.997078 4681 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/51ee2d02-f1ea-4e04-817a-c08925a2078d-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.997182 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjbkr\" (UniqueName: \"kubernetes.io/projected/51ee2d02-f1ea-4e04-817a-c08925a2078d-kube-api-access-sjbkr\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.997250 4681 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/51ee2d02-f1ea-4e04-817a-c08925a2078d-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.997335 4681 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/51ee2d02-f1ea-4e04-817a-c08925a2078d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:25 crc kubenswrapper[4681]: I1007 17:21:25.997068 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ee2d02-f1ea-4e04-817a-c08925a2078d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51ee2d02-f1ea-4e04-817a-c08925a2078d" (UID: "51ee2d02-f1ea-4e04-817a-c08925a2078d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:21:26 crc kubenswrapper[4681]: I1007 17:21:26.029065 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ee2d02-f1ea-4e04-817a-c08925a2078d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "51ee2d02-f1ea-4e04-817a-c08925a2078d" (UID: "51ee2d02-f1ea-4e04-817a-c08925a2078d"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:21:26 crc kubenswrapper[4681]: I1007 17:21:26.105929 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ee2d02-f1ea-4e04-817a-c08925a2078d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:26 crc kubenswrapper[4681]: I1007 17:21:26.106197 4681 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/51ee2d02-f1ea-4e04-817a-c08925a2078d-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:26 crc kubenswrapper[4681]: I1007 17:21:26.490173 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-npftw" Oct 07 17:21:26 crc kubenswrapper[4681]: I1007 17:21:26.492073 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-npftw" event={"ID":"51ee2d02-f1ea-4e04-817a-c08925a2078d","Type":"ContainerDied","Data":"51b8f145216333b112db48506eae0d35e3001473a8bea07479d70251e1229d2b"} Oct 07 17:21:26 crc kubenswrapper[4681]: I1007 17:21:26.492218 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51b8f145216333b112db48506eae0d35e3001473a8bea07479d70251e1229d2b" Oct 07 17:21:26 crc kubenswrapper[4681]: I1007 17:21:26.816592 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-87a1-account-create-ngqhh" Oct 07 17:21:26 crc kubenswrapper[4681]: I1007 17:21:26.921359 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctxp2\" (UniqueName: \"kubernetes.io/projected/5da275ce-2936-42ba-a43d-52605c4f5cb4-kube-api-access-ctxp2\") pod \"5da275ce-2936-42ba-a43d-52605c4f5cb4\" (UID: \"5da275ce-2936-42ba-a43d-52605c4f5cb4\") " Oct 07 17:21:26 crc kubenswrapper[4681]: I1007 17:21:26.927066 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5da275ce-2936-42ba-a43d-52605c4f5cb4-kube-api-access-ctxp2" (OuterVolumeSpecName: "kube-api-access-ctxp2") pod "5da275ce-2936-42ba-a43d-52605c4f5cb4" (UID: "5da275ce-2936-42ba-a43d-52605c4f5cb4"). InnerVolumeSpecName "kube-api-access-ctxp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:21:27 crc kubenswrapper[4681]: I1007 17:21:27.003113 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5f89-account-create-clbjp" Oct 07 17:21:27 crc kubenswrapper[4681]: I1007 17:21:27.008515 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5a1b-account-create-swzjl" Oct 07 17:21:27 crc kubenswrapper[4681]: I1007 17:21:27.023951 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctxp2\" (UniqueName: \"kubernetes.io/projected/5da275ce-2936-42ba-a43d-52605c4f5cb4-kube-api-access-ctxp2\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:27 crc kubenswrapper[4681]: I1007 17:21:27.125342 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzvm7\" (UniqueName: \"kubernetes.io/projected/72933878-17c1-4cb1-b068-1c19741adf5d-kube-api-access-jzvm7\") pod \"72933878-17c1-4cb1-b068-1c19741adf5d\" (UID: \"72933878-17c1-4cb1-b068-1c19741adf5d\") " Oct 07 17:21:27 crc kubenswrapper[4681]: I1007 17:21:27.125441 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcb2w\" (UniqueName: \"kubernetes.io/projected/4ebb4c42-5825-4c0f-9cb5-ab4b915e4bbf-kube-api-access-lcb2w\") pod \"4ebb4c42-5825-4c0f-9cb5-ab4b915e4bbf\" (UID: \"4ebb4c42-5825-4c0f-9cb5-ab4b915e4bbf\") " Oct 07 17:21:27 crc kubenswrapper[4681]: I1007 17:21:27.138707 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72933878-17c1-4cb1-b068-1c19741adf5d-kube-api-access-jzvm7" (OuterVolumeSpecName: "kube-api-access-jzvm7") pod "72933878-17c1-4cb1-b068-1c19741adf5d" (UID: "72933878-17c1-4cb1-b068-1c19741adf5d"). InnerVolumeSpecName "kube-api-access-jzvm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:21:27 crc kubenswrapper[4681]: I1007 17:21:27.138974 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ebb4c42-5825-4c0f-9cb5-ab4b915e4bbf-kube-api-access-lcb2w" (OuterVolumeSpecName: "kube-api-access-lcb2w") pod "4ebb4c42-5825-4c0f-9cb5-ab4b915e4bbf" (UID: "4ebb4c42-5825-4c0f-9cb5-ab4b915e4bbf"). InnerVolumeSpecName "kube-api-access-lcb2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:21:27 crc kubenswrapper[4681]: I1007 17:21:27.227309 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcb2w\" (UniqueName: \"kubernetes.io/projected/4ebb4c42-5825-4c0f-9cb5-ab4b915e4bbf-kube-api-access-lcb2w\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:27 crc kubenswrapper[4681]: I1007 17:21:27.227344 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzvm7\" (UniqueName: \"kubernetes.io/projected/72933878-17c1-4cb1-b068-1c19741adf5d-kube-api-access-jzvm7\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:27 crc kubenswrapper[4681]: I1007 17:21:27.497332 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5f89-account-create-clbjp" Oct 07 17:21:27 crc kubenswrapper[4681]: I1007 17:21:27.497352 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5f89-account-create-clbjp" event={"ID":"72933878-17c1-4cb1-b068-1c19741adf5d","Type":"ContainerDied","Data":"a01557f1da0dd0b6df4942e6810cbdf635fe5dabcd72fc492c3bf201ca47e185"} Oct 07 17:21:27 crc kubenswrapper[4681]: I1007 17:21:27.497392 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a01557f1da0dd0b6df4942e6810cbdf635fe5dabcd72fc492c3bf201ca47e185" Oct 07 17:21:27 crc kubenswrapper[4681]: I1007 17:21:27.499499 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-87a1-account-create-ngqhh" Oct 07 17:21:27 crc kubenswrapper[4681]: I1007 17:21:27.500041 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-87a1-account-create-ngqhh" event={"ID":"5da275ce-2936-42ba-a43d-52605c4f5cb4","Type":"ContainerDied","Data":"1cb36ba05b91f164e456635ff5c90a7f21f39dde0ed621868e11b59fed9d3979"} Oct 07 17:21:27 crc kubenswrapper[4681]: I1007 17:21:27.500135 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cb36ba05b91f164e456635ff5c90a7f21f39dde0ed621868e11b59fed9d3979" Oct 07 17:21:27 crc kubenswrapper[4681]: I1007 17:21:27.502750 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5a1b-account-create-swzjl" event={"ID":"4ebb4c42-5825-4c0f-9cb5-ab4b915e4bbf","Type":"ContainerDied","Data":"e8e979427b77991a21b6ce9131e31a9c985442c2825651cdb218a9b5d338c354"} Oct 07 17:21:27 crc kubenswrapper[4681]: I1007 17:21:27.502789 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5a1b-account-create-swzjl" Oct 07 17:21:27 crc kubenswrapper[4681]: I1007 17:21:27.502790 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8e979427b77991a21b6ce9131e31a9c985442c2825651cdb218a9b5d338c354" Oct 07 17:21:28 crc kubenswrapper[4681]: I1007 17:21:28.844066 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-xhwkc" podUID="8be45f14-7feb-40fa-a0a8-919c6d8cd052" containerName="ovn-controller" probeResult="failure" output=< Oct 07 17:21:28 crc kubenswrapper[4681]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 07 17:21:28 crc kubenswrapper[4681]: > Oct 07 17:21:28 crc kubenswrapper[4681]: I1007 17:21:28.936445 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-6tf88" Oct 07 17:21:28 crc kubenswrapper[4681]: I1007 17:21:28.941135 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-6tf88" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.164338 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-xhwkc-config-5smwp"] Oct 07 17:21:29 crc kubenswrapper[4681]: E1007 17:21:29.164983 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72933878-17c1-4cb1-b068-1c19741adf5d" containerName="mariadb-account-create" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.165078 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="72933878-17c1-4cb1-b068-1c19741adf5d" containerName="mariadb-account-create" Oct 07 17:21:29 crc kubenswrapper[4681]: E1007 17:21:29.165199 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ebb4c42-5825-4c0f-9cb5-ab4b915e4bbf" containerName="mariadb-account-create" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.165280 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ebb4c42-5825-4c0f-9cb5-ab4b915e4bbf" containerName="mariadb-account-create" Oct 07 17:21:29 crc kubenswrapper[4681]: E1007 17:21:29.165374 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ee2d02-f1ea-4e04-817a-c08925a2078d" containerName="swift-ring-rebalance" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.165449 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ee2d02-f1ea-4e04-817a-c08925a2078d" containerName="swift-ring-rebalance" Oct 07 17:21:29 crc 
kubenswrapper[4681]: E1007 17:21:29.165528 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da275ce-2936-42ba-a43d-52605c4f5cb4" containerName="mariadb-account-create" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.165603 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da275ce-2936-42ba-a43d-52605c4f5cb4" containerName="mariadb-account-create" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.165848 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="51ee2d02-f1ea-4e04-817a-c08925a2078d" containerName="swift-ring-rebalance" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.165955 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ebb4c42-5825-4c0f-9cb5-ab4b915e4bbf" containerName="mariadb-account-create" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.166033 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="5da275ce-2936-42ba-a43d-52605c4f5cb4" containerName="mariadb-account-create" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.166129 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="72933878-17c1-4cb1-b068-1c19741adf5d" containerName="mariadb-account-create" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.166896 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xhwkc-config-5smwp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.168983 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.195708 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xhwkc-config-5smwp"] Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.258059 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkmf2\" (UniqueName: \"kubernetes.io/projected/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-kube-api-access-nkmf2\") pod \"ovn-controller-xhwkc-config-5smwp\" (UID: \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\") " pod="openstack/ovn-controller-xhwkc-config-5smwp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.258110 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-var-run-ovn\") pod \"ovn-controller-xhwkc-config-5smwp\" (UID: \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\") " pod="openstack/ovn-controller-xhwkc-config-5smwp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.258142 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-additional-scripts\") pod \"ovn-controller-xhwkc-config-5smwp\" (UID: \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\") " pod="openstack/ovn-controller-xhwkc-config-5smwp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.258161 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-scripts\") pod \"ovn-controller-xhwkc-config-5smwp\" (UID: \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\") " pod="openstack/ovn-controller-xhwkc-config-5smwp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.258213 4681 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-var-run\") pod \"ovn-controller-xhwkc-config-5smwp\" (UID: \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\") " pod="openstack/ovn-controller-xhwkc-config-5smwp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.258281 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-var-log-ovn\") pod \"ovn-controller-xhwkc-config-5smwp\" (UID: \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\") " pod="openstack/ovn-controller-xhwkc-config-5smwp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.360090 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-var-log-ovn\") pod \"ovn-controller-xhwkc-config-5smwp\" (UID: \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\") " pod="openstack/ovn-controller-xhwkc-config-5smwp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.360212 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkmf2\" (UniqueName: \"kubernetes.io/projected/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-kube-api-access-nkmf2\") pod \"ovn-controller-xhwkc-config-5smwp\" (UID: \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\") " pod="openstack/ovn-controller-xhwkc-config-5smwp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.360238 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-var-run-ovn\") pod \"ovn-controller-xhwkc-config-5smwp\" (UID: \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\") " pod="openstack/ovn-controller-xhwkc-config-5smwp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.360262 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-additional-scripts\") pod \"ovn-controller-xhwkc-config-5smwp\" (UID: \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\") " pod="openstack/ovn-controller-xhwkc-config-5smwp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.360285 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-scripts\") pod \"ovn-controller-xhwkc-config-5smwp\" (UID: \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\") " pod="openstack/ovn-controller-xhwkc-config-5smwp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.360570 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-var-log-ovn\") pod \"ovn-controller-xhwkc-config-5smwp\" (UID: \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\") " pod="openstack/ovn-controller-xhwkc-config-5smwp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.360620 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-var-run-ovn\") pod \"ovn-controller-xhwkc-config-5smwp\" (UID: \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\") " pod="openstack/ovn-controller-xhwkc-config-5smwp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.361059 4681 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-var-run\") pod \"ovn-controller-xhwkc-config-5smwp\" (UID: \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\") " pod="openstack/ovn-controller-xhwkc-config-5smwp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.361075 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-additional-scripts\") pod \"ovn-controller-xhwkc-config-5smwp\" (UID: \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\") " pod="openstack/ovn-controller-xhwkc-config-5smwp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.361146 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-var-run\") pod \"ovn-controller-xhwkc-config-5smwp\" (UID: \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\") " pod="openstack/ovn-controller-xhwkc-config-5smwp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.362387 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-scripts\") pod \"ovn-controller-xhwkc-config-5smwp\" (UID: \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\") " pod="openstack/ovn-controller-xhwkc-config-5smwp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.382919 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkmf2\" (UniqueName: \"kubernetes.io/projected/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-kube-api-access-nkmf2\") pod \"ovn-controller-xhwkc-config-5smwp\" (UID: \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\") " pod="openstack/ovn-controller-xhwkc-config-5smwp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.443318 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-pgdhp"] Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.445769 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-pgdhp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.448117 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4g6lg" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.448140 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.457500 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-pgdhp"] Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.489518 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-xhwkc-config-5smwp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.524510 4681 generic.go:334] "Generic (PLEG): container finished" podID="44a71bcd-3178-4394-8031-673c93a6981e" containerID="684797f48d112b934082488598634aba95ab60ac31835fe41f631c9666e298a5" exitCode=0 Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.524794 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"44a71bcd-3178-4394-8031-673c93a6981e","Type":"ContainerDied","Data":"684797f48d112b934082488598634aba95ab60ac31835fe41f631c9666e298a5"} Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.570414 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fd19541-0a38-4bab-bc65-ac2700770ce1-config-data\") pod \"glance-db-sync-pgdhp\" (UID: \"2fd19541-0a38-4bab-bc65-ac2700770ce1\") " pod="openstack/glance-db-sync-pgdhp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.570547 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-228bv\" (UniqueName: \"kubernetes.io/projected/2fd19541-0a38-4bab-bc65-ac2700770ce1-kube-api-access-228bv\") pod \"glance-db-sync-pgdhp\" (UID: \"2fd19541-0a38-4bab-bc65-ac2700770ce1\") " pod="openstack/glance-db-sync-pgdhp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.571257 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2fd19541-0a38-4bab-bc65-ac2700770ce1-db-sync-config-data\") pod \"glance-db-sync-pgdhp\" (UID: \"2fd19541-0a38-4bab-bc65-ac2700770ce1\") " pod="openstack/glance-db-sync-pgdhp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.571563 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd19541-0a38-4bab-bc65-ac2700770ce1-combined-ca-bundle\") pod \"glance-db-sync-pgdhp\" (UID: \"2fd19541-0a38-4bab-bc65-ac2700770ce1\") " pod="openstack/glance-db-sync-pgdhp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.672802 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2fd19541-0a38-4bab-bc65-ac2700770ce1-db-sync-config-data\") pod \"glance-db-sync-pgdhp\" (UID: \"2fd19541-0a38-4bab-bc65-ac2700770ce1\") " pod="openstack/glance-db-sync-pgdhp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.673103 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd19541-0a38-4bab-bc65-ac2700770ce1-combined-ca-bundle\") pod \"glance-db-sync-pgdhp\" (UID: \"2fd19541-0a38-4bab-bc65-ac2700770ce1\") " pod="openstack/glance-db-sync-pgdhp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.673164 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fd19541-0a38-4bab-bc65-ac2700770ce1-config-data\") pod \"glance-db-sync-pgdhp\" (UID: \"2fd19541-0a38-4bab-bc65-ac2700770ce1\") " pod="openstack/glance-db-sync-pgdhp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.673201 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-228bv\" (UniqueName: 
\"kubernetes.io/projected/2fd19541-0a38-4bab-bc65-ac2700770ce1-kube-api-access-228bv\") pod \"glance-db-sync-pgdhp\" (UID: \"2fd19541-0a38-4bab-bc65-ac2700770ce1\") " pod="openstack/glance-db-sync-pgdhp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.679303 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2fd19541-0a38-4bab-bc65-ac2700770ce1-db-sync-config-data\") pod \"glance-db-sync-pgdhp\" (UID: \"2fd19541-0a38-4bab-bc65-ac2700770ce1\") " pod="openstack/glance-db-sync-pgdhp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.686664 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fd19541-0a38-4bab-bc65-ac2700770ce1-config-data\") pod \"glance-db-sync-pgdhp\" (UID: \"2fd19541-0a38-4bab-bc65-ac2700770ce1\") " pod="openstack/glance-db-sync-pgdhp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.687363 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd19541-0a38-4bab-bc65-ac2700770ce1-combined-ca-bundle\") pod \"glance-db-sync-pgdhp\" (UID: \"2fd19541-0a38-4bab-bc65-ac2700770ce1\") " pod="openstack/glance-db-sync-pgdhp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.715154 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-228bv\" (UniqueName: \"kubernetes.io/projected/2fd19541-0a38-4bab-bc65-ac2700770ce1-kube-api-access-228bv\") pod \"glance-db-sync-pgdhp\" (UID: \"2fd19541-0a38-4bab-bc65-ac2700770ce1\") " pod="openstack/glance-db-sync-pgdhp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.764807 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-pgdhp" Oct 07 17:21:29 crc kubenswrapper[4681]: I1007 17:21:29.788869 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xhwkc-config-5smwp"] Oct 07 17:21:29 crc kubenswrapper[4681]: W1007 17:21:29.792363 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f91efa8_3feb_4f2c_837b_c3db92c9f85d.slice/crio-75445bf6c2b74589a2e3c8f2c125d39c8970be01d89296411615cb10e21dab0b WatchSource:0}: Error finding container 75445bf6c2b74589a2e3c8f2c125d39c8970be01d89296411615cb10e21dab0b: Status 404 returned error can't find the container with id 75445bf6c2b74589a2e3c8f2c125d39c8970be01d89296411615cb10e21dab0b Oct 07 17:21:30 crc kubenswrapper[4681]: I1007 17:21:30.308995 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-pgdhp"] Oct 07 17:21:30 crc kubenswrapper[4681]: I1007 17:21:30.534770 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"44a71bcd-3178-4394-8031-673c93a6981e","Type":"ContainerStarted","Data":"4b0386f421398abfcce82612a18baa1f10211504d81960f450d73cc421a70798"} Oct 07 17:21:30 crc kubenswrapper[4681]: I1007 17:21:30.535325 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 07 17:21:30 crc kubenswrapper[4681]: I1007 17:21:30.536285 4681 generic.go:334] "Generic (PLEG): container finished" podID="c8a62bbf-000f-4b40-87e9-8dad6f714178" containerID="63784714c0d4bc78a8ead0376932527f23f2511b675cec54cfcb3226d4bdd559" exitCode=0 Oct 07 17:21:30 crc kubenswrapper[4681]: I1007 17:21:30.536366 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c8a62bbf-000f-4b40-87e9-8dad6f714178","Type":"ContainerDied","Data":"63784714c0d4bc78a8ead0376932527f23f2511b675cec54cfcb3226d4bdd559"} Oct 07 17:21:30 crc kubenswrapper[4681]: I1007 17:21:30.538283 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pgdhp" event={"ID":"2fd19541-0a38-4bab-bc65-ac2700770ce1","Type":"ContainerStarted","Data":"d6c0a8126b87ce7cd20e510e101d9c0238379008553cb1b814cf598df18df3e5"} Oct 07 17:21:30 crc kubenswrapper[4681]: I1007 17:21:30.539853 4681 generic.go:334] "Generic (PLEG): container finished" podID="9f91efa8-3feb-4f2c-837b-c3db92c9f85d" containerID="71834050946ec519427085e9d34a98bce4e7f3aa477d5ac163e4c02b21a71def" exitCode=0 Oct 07 17:21:30 crc kubenswrapper[4681]: I1007 17:21:30.539910 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xhwkc-config-5smwp" event={"ID":"9f91efa8-3feb-4f2c-837b-c3db92c9f85d","Type":"ContainerDied","Data":"71834050946ec519427085e9d34a98bce4e7f3aa477d5ac163e4c02b21a71def"} Oct 07 17:21:30 crc kubenswrapper[4681]: I1007 17:21:30.539937 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xhwkc-config-5smwp" event={"ID":"9f91efa8-3feb-4f2c-837b-c3db92c9f85d","Type":"ContainerStarted","Data":"75445bf6c2b74589a2e3c8f2c125d39c8970be01d89296411615cb10e21dab0b"} Oct 07 17:21:30 crc kubenswrapper[4681]: I1007 17:21:30.564559 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.086235236 podStartE2EDuration="1m22.564542417s" podCreationTimestamp="2025-10-07 17:20:08 +0000 UTC" firstStartedPulling="2025-10-07 17:20:10.883742538 +0000 UTC m=+1014.531154083" 
lastFinishedPulling="2025-10-07 17:20:56.362049709 +0000 UTC m=+1060.009461264" observedRunningTime="2025-10-07 17:21:30.558797537 +0000 UTC m=+1094.206209092" watchObservedRunningTime="2025-10-07 17:21:30.564542417 +0000 UTC m=+1094.211953982" Oct 07 17:21:31 crc kubenswrapper[4681]: I1007 17:21:31.551041 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c8a62bbf-000f-4b40-87e9-8dad6f714178","Type":"ContainerStarted","Data":"6924af7e009ce21c3779524f061005b2d457d3c14b2242e4ae72a1082282a1db"} Oct 07 17:21:31 crc kubenswrapper[4681]: I1007 17:21:31.552802 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:21:31 crc kubenswrapper[4681]: I1007 17:21:31.572653 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371954.282145 podStartE2EDuration="1m22.572630634s" podCreationTimestamp="2025-10-07 17:20:09 +0000 UTC" firstStartedPulling="2025-10-07 17:20:11.469533998 +0000 UTC m=+1015.116945553" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:21:31.570244568 +0000 UTC m=+1095.217656123" watchObservedRunningTime="2025-10-07 17:21:31.572630634 +0000 UTC m=+1095.220042189" Oct 07 17:21:31 crc kubenswrapper[4681]: I1007 17:21:31.875513 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xhwkc-config-5smwp" Oct 07 17:21:31 crc kubenswrapper[4681]: I1007 17:21:31.912574 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-additional-scripts\") pod \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\" (UID: \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\") " Oct 07 17:21:31 crc kubenswrapper[4681]: I1007 17:21:31.912622 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-var-log-ovn\") pod \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\" (UID: \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\") " Oct 07 17:21:31 crc kubenswrapper[4681]: I1007 17:21:31.912646 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-var-run-ovn\") pod \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\" (UID: \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\") " Oct 07 17:21:31 crc kubenswrapper[4681]: I1007 17:21:31.912673 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-var-run\") pod \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\" (UID: \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\") " Oct 07 17:21:31 crc kubenswrapper[4681]: I1007 17:21:31.912694 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "9f91efa8-3feb-4f2c-837b-c3db92c9f85d" (UID: "9f91efa8-3feb-4f2c-837b-c3db92c9f85d"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 17:21:31 crc kubenswrapper[4681]: I1007 17:21:31.912745 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "9f91efa8-3feb-4f2c-837b-c3db92c9f85d" (UID: "9f91efa8-3feb-4f2c-837b-c3db92c9f85d"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 17:21:31 crc kubenswrapper[4681]: I1007 17:21:31.912762 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-scripts\") pod \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\" (UID: \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\") " Oct 07 17:21:31 crc kubenswrapper[4681]: I1007 17:21:31.912830 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-var-run" (OuterVolumeSpecName: "var-run") pod "9f91efa8-3feb-4f2c-837b-c3db92c9f85d" (UID: "9f91efa8-3feb-4f2c-837b-c3db92c9f85d"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 17:21:31 crc kubenswrapper[4681]: I1007 17:21:31.912836 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkmf2\" (UniqueName: \"kubernetes.io/projected/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-kube-api-access-nkmf2\") pod \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\" (UID: \"9f91efa8-3feb-4f2c-837b-c3db92c9f85d\") " Oct 07 17:21:31 crc kubenswrapper[4681]: I1007 17:21:31.913125 4681 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:31 crc kubenswrapper[4681]: I1007 17:21:31.913136 4681 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:31 crc kubenswrapper[4681]: I1007 17:21:31.913144 4681 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-var-run\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:31 crc kubenswrapper[4681]: I1007 17:21:31.913218 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "9f91efa8-3feb-4f2c-837b-c3db92c9f85d" (UID: "9f91efa8-3feb-4f2c-837b-c3db92c9f85d"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:21:31 crc kubenswrapper[4681]: I1007 17:21:31.913674 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-scripts" (OuterVolumeSpecName: "scripts") pod "9f91efa8-3feb-4f2c-837b-c3db92c9f85d" (UID: "9f91efa8-3feb-4f2c-837b-c3db92c9f85d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:21:31 crc kubenswrapper[4681]: I1007 17:21:31.919078 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-kube-api-access-nkmf2" (OuterVolumeSpecName: "kube-api-access-nkmf2") pod "9f91efa8-3feb-4f2c-837b-c3db92c9f85d" (UID: "9f91efa8-3feb-4f2c-837b-c3db92c9f85d"). InnerVolumeSpecName "kube-api-access-nkmf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:21:32 crc kubenswrapper[4681]: I1007 17:21:32.015409 4681 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:32 crc kubenswrapper[4681]: I1007 17:21:32.015447 4681 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:32 crc kubenswrapper[4681]: I1007 17:21:32.015507 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkmf2\" (UniqueName: \"kubernetes.io/projected/9f91efa8-3feb-4f2c-837b-c3db92c9f85d-kube-api-access-nkmf2\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:32 crc kubenswrapper[4681]: I1007 17:21:32.561987 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xhwkc-config-5smwp" event={"ID":"9f91efa8-3feb-4f2c-837b-c3db92c9f85d","Type":"ContainerDied","Data":"75445bf6c2b74589a2e3c8f2c125d39c8970be01d89296411615cb10e21dab0b"} Oct 07 17:21:32 crc kubenswrapper[4681]: I1007 17:21:32.562275 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75445bf6c2b74589a2e3c8f2c125d39c8970be01d89296411615cb10e21dab0b" Oct 07 17:21:32 crc kubenswrapper[4681]: I1007 17:21:32.562066 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-xhwkc-config-5smwp" Oct 07 17:21:33 crc kubenswrapper[4681]: I1007 17:21:33.027057 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-xhwkc-config-5smwp"] Oct 07 17:21:33 crc kubenswrapper[4681]: I1007 17:21:33.045713 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-xhwkc-config-5smwp"] Oct 07 17:21:33 crc kubenswrapper[4681]: I1007 17:21:33.864521 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-xhwkc" Oct 07 17:21:35 crc kubenswrapper[4681]: I1007 17:21:35.038144 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f91efa8-3feb-4f2c-837b-c3db92c9f85d" path="/var/lib/kubelet/pods/9f91efa8-3feb-4f2c-837b-c3db92c9f85d/volumes" Oct 07 17:21:39 crc kubenswrapper[4681]: I1007 17:21:39.331284 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e111df37-d4f7-4dc5-ad9a-04b05519309a-etc-swift\") pod \"swift-storage-0\" (UID: \"e111df37-d4f7-4dc5-ad9a-04b05519309a\") " pod="openstack/swift-storage-0" Oct 07 17:21:39 crc kubenswrapper[4681]: I1007 17:21:39.354756 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e111df37-d4f7-4dc5-ad9a-04b05519309a-etc-swift\") pod \"swift-storage-0\" (UID: \"e111df37-d4f7-4dc5-ad9a-04b05519309a\") " pod="openstack/swift-storage-0" Oct 07 17:21:39 crc kubenswrapper[4681]: I1007 17:21:39.402167 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 07 17:21:40 crc kubenswrapper[4681]: I1007 17:21:40.337149 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 07 17:21:40 crc kubenswrapper[4681]: I1007 17:21:40.686704 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-d2gsl"] Oct 07 17:21:40 crc kubenswrapper[4681]: E1007 17:21:40.687046 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f91efa8-3feb-4f2c-837b-c3db92c9f85d" containerName="ovn-config" Oct 07 17:21:40 crc kubenswrapper[4681]: I1007 17:21:40.687070 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f91efa8-3feb-4f2c-837b-c3db92c9f85d" containerName="ovn-config" Oct 07 17:21:40 crc kubenswrapper[4681]: I1007 17:21:40.687281 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f91efa8-3feb-4f2c-837b-c3db92c9f85d" containerName="ovn-config" Oct 07 17:21:40 crc kubenswrapper[4681]: I1007 17:21:40.687893 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-d2gsl" Oct 07 17:21:40 crc kubenswrapper[4681]: I1007 17:21:40.715363 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-d2gsl"] Oct 07 17:21:40 crc kubenswrapper[4681]: I1007 17:21:40.765666 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwgkq\" (UniqueName: \"kubernetes.io/projected/2fd1c96e-3ade-4991-a129-9e12fd5837db-kube-api-access-pwgkq\") pod \"cinder-db-create-d2gsl\" (UID: \"2fd1c96e-3ade-4991-a129-9e12fd5837db\") " pod="openstack/cinder-db-create-d2gsl" Oct 07 17:21:40 crc kubenswrapper[4681]: I1007 17:21:40.781252 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-xcdt5"] Oct 07 17:21:40 crc kubenswrapper[4681]: I1007 17:21:40.782205 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-xcdt5" Oct 07 17:21:40 crc kubenswrapper[4681]: I1007 17:21:40.795082 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-xcdt5"] Oct 07 17:21:40 crc kubenswrapper[4681]: I1007 17:21:40.867213 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwgkq\" (UniqueName: \"kubernetes.io/projected/2fd1c96e-3ade-4991-a129-9e12fd5837db-kube-api-access-pwgkq\") pod \"cinder-db-create-d2gsl\" (UID: \"2fd1c96e-3ade-4991-a129-9e12fd5837db\") " pod="openstack/cinder-db-create-d2gsl" Oct 07 17:21:40 crc kubenswrapper[4681]: I1007 17:21:40.867334 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wmrr\" (UniqueName: \"kubernetes.io/projected/f5409c34-27ff-4970-a35c-5bb5ee377fb9-kube-api-access-8wmrr\") pod \"barbican-db-create-xcdt5\" (UID: \"f5409c34-27ff-4970-a35c-5bb5ee377fb9\") " pod="openstack/barbican-db-create-xcdt5" Oct 07 17:21:40 crc kubenswrapper[4681]: I1007 17:21:40.890453 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwgkq\" (UniqueName: \"kubernetes.io/projected/2fd1c96e-3ade-4991-a129-9e12fd5837db-kube-api-access-pwgkq\") pod \"cinder-db-create-d2gsl\" (UID: \"2fd1c96e-3ade-4991-a129-9e12fd5837db\") " pod="openstack/cinder-db-create-d2gsl" Oct 07 17:21:40 crc kubenswrapper[4681]: I1007 17:21:40.966069 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:21:40 crc kubenswrapper[4681]: I1007 17:21:40.967998 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wmrr\" (UniqueName: \"kubernetes.io/projected/f5409c34-27ff-4970-a35c-5bb5ee377fb9-kube-api-access-8wmrr\") pod \"barbican-db-create-xcdt5\" (UID: \"f5409c34-27ff-4970-a35c-5bb5ee377fb9\") " pod="openstack/barbican-db-create-xcdt5" Oct 07 17:21:40 crc kubenswrapper[4681]: I1007 17:21:40.969325 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-rhm4v"] Oct 07 17:21:40 crc kubenswrapper[4681]: I1007 17:21:40.970423 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rhm4v" Oct 07 17:21:40 crc kubenswrapper[4681]: I1007 17:21:40.982807 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rhm4v"] Oct 07 17:21:41 crc kubenswrapper[4681]: I1007 17:21:41.004782 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-d2gsl" Oct 07 17:21:41 crc kubenswrapper[4681]: I1007 17:21:41.011910 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wmrr\" (UniqueName: \"kubernetes.io/projected/f5409c34-27ff-4970-a35c-5bb5ee377fb9-kube-api-access-8wmrr\") pod \"barbican-db-create-xcdt5\" (UID: \"f5409c34-27ff-4970-a35c-5bb5ee377fb9\") " pod="openstack/barbican-db-create-xcdt5" Oct 07 17:21:41 crc kubenswrapper[4681]: I1007 17:21:41.103588 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-xcdt5" Oct 07 17:21:41 crc kubenswrapper[4681]: I1007 17:21:41.170971 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btm59\" (UniqueName: \"kubernetes.io/projected/6ab816f0-aaab-4889-817f-3cbe0492dfe0-kube-api-access-btm59\") pod \"neutron-db-create-rhm4v\" (UID: \"6ab816f0-aaab-4889-817f-3cbe0492dfe0\") " pod="openstack/neutron-db-create-rhm4v" Oct 07 17:21:41 crc kubenswrapper[4681]: I1007 17:21:41.232169 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-ht4mq"] Oct 07 17:21:41 crc kubenswrapper[4681]: I1007 17:21:41.233148 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ht4mq" Oct 07 17:21:41 crc kubenswrapper[4681]: I1007 17:21:41.235681 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 17:21:41 crc kubenswrapper[4681]: I1007 17:21:41.236367 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4bnt7" Oct 07 17:21:41 crc kubenswrapper[4681]: I1007 17:21:41.236542 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 17:21:41 crc kubenswrapper[4681]: I1007 17:21:41.238519 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 17:21:41 crc kubenswrapper[4681]: I1007 17:21:41.247366 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ht4mq"] Oct 07 17:21:41 crc kubenswrapper[4681]: I1007 17:21:41.272558 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btm59\" (UniqueName: \"kubernetes.io/projected/6ab816f0-aaab-4889-817f-3cbe0492dfe0-kube-api-access-btm59\") pod \"neutron-db-create-rhm4v\" (UID: \"6ab816f0-aaab-4889-817f-3cbe0492dfe0\") " pod="openstack/neutron-db-create-rhm4v" Oct 07 17:21:41 crc kubenswrapper[4681]: I1007 17:21:41.294703 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btm59\" (UniqueName: \"kubernetes.io/projected/6ab816f0-aaab-4889-817f-3cbe0492dfe0-kube-api-access-btm59\") pod \"neutron-db-create-rhm4v\" (UID: \"6ab816f0-aaab-4889-817f-3cbe0492dfe0\") " pod="openstack/neutron-db-create-rhm4v" Oct 07 17:21:41 crc kubenswrapper[4681]: I1007 17:21:41.373670 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6347ab39-7d52-4a04-ac0a-3df98268b8fe-config-data\") pod \"keystone-db-sync-ht4mq\" (UID: \"6347ab39-7d52-4a04-ac0a-3df98268b8fe\") " pod="openstack/keystone-db-sync-ht4mq" Oct 07 17:21:41 crc kubenswrapper[4681]: I1007 17:21:41.373813 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9z76\" 
(UniqueName: \"kubernetes.io/projected/6347ab39-7d52-4a04-ac0a-3df98268b8fe-kube-api-access-d9z76\") pod \"keystone-db-sync-ht4mq\" (UID: \"6347ab39-7d52-4a04-ac0a-3df98268b8fe\") " pod="openstack/keystone-db-sync-ht4mq" Oct 07 17:21:41 crc kubenswrapper[4681]: I1007 17:21:41.373864 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6347ab39-7d52-4a04-ac0a-3df98268b8fe-combined-ca-bundle\") pod \"keystone-db-sync-ht4mq\" (UID: \"6347ab39-7d52-4a04-ac0a-3df98268b8fe\") " pod="openstack/keystone-db-sync-ht4mq" Oct 07 17:21:41 crc kubenswrapper[4681]: I1007 17:21:41.475530 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6347ab39-7d52-4a04-ac0a-3df98268b8fe-config-data\") pod \"keystone-db-sync-ht4mq\" (UID: \"6347ab39-7d52-4a04-ac0a-3df98268b8fe\") " pod="openstack/keystone-db-sync-ht4mq" Oct 07 17:21:41 crc kubenswrapper[4681]: I1007 17:21:41.475622 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9z76\" (UniqueName: \"kubernetes.io/projected/6347ab39-7d52-4a04-ac0a-3df98268b8fe-kube-api-access-d9z76\") pod \"keystone-db-sync-ht4mq\" (UID: \"6347ab39-7d52-4a04-ac0a-3df98268b8fe\") " pod="openstack/keystone-db-sync-ht4mq" Oct 07 17:21:41 crc kubenswrapper[4681]: I1007 17:21:41.475669 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6347ab39-7d52-4a04-ac0a-3df98268b8fe-combined-ca-bundle\") pod \"keystone-db-sync-ht4mq\" (UID: \"6347ab39-7d52-4a04-ac0a-3df98268b8fe\") " pod="openstack/keystone-db-sync-ht4mq" Oct 07 17:21:41 crc kubenswrapper[4681]: I1007 17:21:41.480394 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6347ab39-7d52-4a04-ac0a-3df98268b8fe-config-data\") pod \"keystone-db-sync-ht4mq\" (UID: \"6347ab39-7d52-4a04-ac0a-3df98268b8fe\") " pod="openstack/keystone-db-sync-ht4mq" Oct 07 17:21:41 crc kubenswrapper[4681]: I1007 17:21:41.495365 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9z76\" (UniqueName: \"kubernetes.io/projected/6347ab39-7d52-4a04-ac0a-3df98268b8fe-kube-api-access-d9z76\") pod \"keystone-db-sync-ht4mq\" (UID: \"6347ab39-7d52-4a04-ac0a-3df98268b8fe\") " pod="openstack/keystone-db-sync-ht4mq" Oct 07 17:21:41 crc kubenswrapper[4681]: I1007 17:21:41.495855 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6347ab39-7d52-4a04-ac0a-3df98268b8fe-combined-ca-bundle\") pod \"keystone-db-sync-ht4mq\" (UID: \"6347ab39-7d52-4a04-ac0a-3df98268b8fe\") " pod="openstack/keystone-db-sync-ht4mq" Oct 07 17:21:41 crc kubenswrapper[4681]: I1007 17:21:41.561585 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ht4mq" Oct 07 17:21:41 crc kubenswrapper[4681]: I1007 17:21:41.584905 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-rhm4v" Oct 07 17:21:42 crc kubenswrapper[4681]: I1007 17:21:42.195195 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 17:21:42 crc kubenswrapper[4681]: I1007 17:21:42.195603 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 17:21:42 crc kubenswrapper[4681]: I1007 17:21:42.195659 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" Oct 07 17:21:42 crc kubenswrapper[4681]: I1007 17:21:42.196371 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b8b100182dd665e9c6705ef1fa26e28e1874f69676a8a7de938754edc7de052a"} pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 17:21:42 crc kubenswrapper[4681]: I1007 17:21:42.196428 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" containerID="cri-o://b8b100182dd665e9c6705ef1fa26e28e1874f69676a8a7de938754edc7de052a" gracePeriod=600 Oct 07 17:21:42 crc kubenswrapper[4681]: I1007 17:21:42.644420 4681 generic.go:334] "Generic (PLEG): container finished" podID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerID="b8b100182dd665e9c6705ef1fa26e28e1874f69676a8a7de938754edc7de052a" exitCode=0 Oct 07 17:21:42 crc kubenswrapper[4681]: I1007 17:21:42.644491 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerDied","Data":"b8b100182dd665e9c6705ef1fa26e28e1874f69676a8a7de938754edc7de052a"} Oct 07 17:21:42 crc kubenswrapper[4681]: I1007 17:21:42.644550 4681 scope.go:117] "RemoveContainer" containerID="f140a217647ffa9460543862999c01a07af9aa4d5b74d190946e7b3d091b13cf" Oct 07 17:21:45 crc kubenswrapper[4681]: E1007 17:21:45.391284 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Oct 07 17:21:45 crc kubenswrapper[4681]: E1007 17:21:45.391808 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-228bv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-pgdhp_openstack(2fd19541-0a38-4bab-bc65-ac2700770ce1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 17:21:45 crc kubenswrapper[4681]: E1007 17:21:45.396013 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-pgdhp" podUID="2fd19541-0a38-4bab-bc65-ac2700770ce1" Oct 07 17:21:45 crc kubenswrapper[4681]: I1007 17:21:45.670270 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerStarted","Data":"78c5b31222deba1f8fdd3bf8fee1a2d7ac203687a55423d769012061ba951cb8"} Oct 07 17:21:45 crc kubenswrapper[4681]: E1007 17:21:45.672084 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-pgdhp" podUID="2fd19541-0a38-4bab-bc65-ac2700770ce1" Oct 07 17:21:45 crc kubenswrapper[4681]: I1007 17:21:45.906362 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ht4mq"] Oct 07 17:21:45 crc kubenswrapper[4681]: I1007 17:21:45.913314 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-d2gsl"] Oct 07 17:21:45 crc kubenswrapper[4681]: W1007 17:21:45.916344 4681 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fd1c96e_3ade_4991_a129_9e12fd5837db.slice/crio-eec1782b3b22f30ab521ed0c5f0dab764538b1236cbacc46d335b93680c67e37 WatchSource:0}: Error finding container eec1782b3b22f30ab521ed0c5f0dab764538b1236cbacc46d335b93680c67e37: Status 404 returned error can't find the container with id eec1782b3b22f30ab521ed0c5f0dab764538b1236cbacc46d335b93680c67e37 Oct 07 17:21:46 crc kubenswrapper[4681]: W1007 17:21:46.033420 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5409c34_27ff_4970_a35c_5bb5ee377fb9.slice/crio-511e42981e8345cd85c4498fe571b9663bf2a0734c2afb55514da09c1c4f509a WatchSource:0}: Error finding container 511e42981e8345cd85c4498fe571b9663bf2a0734c2afb55514da09c1c4f509a: Status 404 returned error can't find the container with id 511e42981e8345cd85c4498fe571b9663bf2a0734c2afb55514da09c1c4f509a Oct 07 17:21:46 crc kubenswrapper[4681]: I1007 17:21:46.033937 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-xcdt5"] Oct 07 17:21:46 crc kubenswrapper[4681]: I1007 17:21:46.053141 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rhm4v"] Oct 07 17:21:46 crc kubenswrapper[4681]: W1007 17:21:46.057155 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ab816f0_aaab_4889_817f_3cbe0492dfe0.slice/crio-d98a7c6337bc13438090780242e21184b5f46b9141c9297cc4da446fed9d6a7f WatchSource:0}: Error finding container d98a7c6337bc13438090780242e21184b5f46b9141c9297cc4da446fed9d6a7f: Status 404 returned error can't find the container with id d98a7c6337bc13438090780242e21184b5f46b9141c9297cc4da446fed9d6a7f Oct 07 17:21:46 crc kubenswrapper[4681]: I1007 17:21:46.105239 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 07 17:21:46 crc kubenswrapper[4681]: I1007 17:21:46.681503 4681 generic.go:334] "Generic (PLEG): container finished" podID="6ab816f0-aaab-4889-817f-3cbe0492dfe0" containerID="36747c666167852514cd58d2dc0e07155503cdc9059811705518c93231ea5007" exitCode=0 Oct 07 17:21:46 crc kubenswrapper[4681]: I1007 17:21:46.681611 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rhm4v" event={"ID":"6ab816f0-aaab-4889-817f-3cbe0492dfe0","Type":"ContainerDied","Data":"36747c666167852514cd58d2dc0e07155503cdc9059811705518c93231ea5007"} Oct 07 17:21:46 crc kubenswrapper[4681]: I1007 17:21:46.681841 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rhm4v" event={"ID":"6ab816f0-aaab-4889-817f-3cbe0492dfe0","Type":"ContainerStarted","Data":"d98a7c6337bc13438090780242e21184b5f46b9141c9297cc4da446fed9d6a7f"} Oct 07 17:21:46 crc kubenswrapper[4681]: I1007 17:21:46.684247 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e111df37-d4f7-4dc5-ad9a-04b05519309a","Type":"ContainerStarted","Data":"e6bec53d2f2ffef02cfd915b735944aaa93dbcad4d4508cd2832b15698bbe80f"} Oct 07 17:21:46 crc kubenswrapper[4681]: I1007 17:21:46.685849 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ht4mq" event={"ID":"6347ab39-7d52-4a04-ac0a-3df98268b8fe","Type":"ContainerStarted","Data":"916a9888a318363524377ba04cba7b8e09ef1df15044c2eb55e317f8d497d424"} Oct 07 17:21:46 crc kubenswrapper[4681]: I1007 17:21:46.688032 4681 generic.go:334] 
"Generic (PLEG): container finished" podID="2fd1c96e-3ade-4991-a129-9e12fd5837db" containerID="5120d5b99de7792408773647009766d11598ab9469f32c5e7c9ca7fea3cc167a" exitCode=0 Oct 07 17:21:46 crc kubenswrapper[4681]: I1007 17:21:46.688107 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-d2gsl" event={"ID":"2fd1c96e-3ade-4991-a129-9e12fd5837db","Type":"ContainerDied","Data":"5120d5b99de7792408773647009766d11598ab9469f32c5e7c9ca7fea3cc167a"} Oct 07 17:21:46 crc kubenswrapper[4681]: I1007 17:21:46.688149 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-d2gsl" event={"ID":"2fd1c96e-3ade-4991-a129-9e12fd5837db","Type":"ContainerStarted","Data":"eec1782b3b22f30ab521ed0c5f0dab764538b1236cbacc46d335b93680c67e37"} Oct 07 17:21:46 crc kubenswrapper[4681]: I1007 17:21:46.693477 4681 generic.go:334] "Generic (PLEG): container finished" podID="f5409c34-27ff-4970-a35c-5bb5ee377fb9" containerID="c6c43115eaace82fb286e9fd45284f16d87955e26ead779d3870669b14e55187" exitCode=0 Oct 07 17:21:46 crc kubenswrapper[4681]: I1007 17:21:46.694314 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-xcdt5" event={"ID":"f5409c34-27ff-4970-a35c-5bb5ee377fb9","Type":"ContainerDied","Data":"c6c43115eaace82fb286e9fd45284f16d87955e26ead779d3870669b14e55187"} Oct 07 17:21:46 crc kubenswrapper[4681]: I1007 17:21:46.694350 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-xcdt5" event={"ID":"f5409c34-27ff-4970-a35c-5bb5ee377fb9","Type":"ContainerStarted","Data":"511e42981e8345cd85c4498fe571b9663bf2a0734c2afb55514da09c1c4f509a"} Oct 07 17:21:47 crc kubenswrapper[4681]: I1007 17:21:47.701329 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e111df37-d4f7-4dc5-ad9a-04b05519309a","Type":"ContainerStarted","Data":"58916d409a02262ec7ef95f15ea11ec0226ae2cd0ce0aa88b352d08dbbdbc326"} Oct 07 17:21:47 crc kubenswrapper[4681]: I1007 17:21:47.701599 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e111df37-d4f7-4dc5-ad9a-04b05519309a","Type":"ContainerStarted","Data":"d19f9db51b38747ef1812c9751337ae47be8d247d10fa7f94dcd4fc8ec0f1ef9"} Oct 07 17:21:48 crc kubenswrapper[4681]: I1007 17:21:48.055998 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-xcdt5" Oct 07 17:21:48 crc kubenswrapper[4681]: I1007 17:21:48.072672 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wmrr\" (UniqueName: \"kubernetes.io/projected/f5409c34-27ff-4970-a35c-5bb5ee377fb9-kube-api-access-8wmrr\") pod \"f5409c34-27ff-4970-a35c-5bb5ee377fb9\" (UID: \"f5409c34-27ff-4970-a35c-5bb5ee377fb9\") " Oct 07 17:21:48 crc kubenswrapper[4681]: I1007 17:21:48.082004 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5409c34-27ff-4970-a35c-5bb5ee377fb9-kube-api-access-8wmrr" (OuterVolumeSpecName: "kube-api-access-8wmrr") pod "f5409c34-27ff-4970-a35c-5bb5ee377fb9" (UID: "f5409c34-27ff-4970-a35c-5bb5ee377fb9"). InnerVolumeSpecName "kube-api-access-8wmrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:21:48 crc kubenswrapper[4681]: I1007 17:21:48.137043 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-rhm4v" Oct 07 17:21:48 crc kubenswrapper[4681]: I1007 17:21:48.143891 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-d2gsl" Oct 07 17:21:48 crc kubenswrapper[4681]: I1007 17:21:48.176103 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wmrr\" (UniqueName: \"kubernetes.io/projected/f5409c34-27ff-4970-a35c-5bb5ee377fb9-kube-api-access-8wmrr\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:48 crc kubenswrapper[4681]: I1007 17:21:48.277526 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btm59\" (UniqueName: \"kubernetes.io/projected/6ab816f0-aaab-4889-817f-3cbe0492dfe0-kube-api-access-btm59\") pod \"6ab816f0-aaab-4889-817f-3cbe0492dfe0\" (UID: \"6ab816f0-aaab-4889-817f-3cbe0492dfe0\") " Oct 07 17:21:48 crc kubenswrapper[4681]: I1007 17:21:48.277623 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwgkq\" (UniqueName: \"kubernetes.io/projected/2fd1c96e-3ade-4991-a129-9e12fd5837db-kube-api-access-pwgkq\") pod \"2fd1c96e-3ade-4991-a129-9e12fd5837db\" (UID: \"2fd1c96e-3ade-4991-a129-9e12fd5837db\") " Oct 07 17:21:48 crc kubenswrapper[4681]: I1007 17:21:48.283761 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ab816f0-aaab-4889-817f-3cbe0492dfe0-kube-api-access-btm59" (OuterVolumeSpecName: "kube-api-access-btm59") pod "6ab816f0-aaab-4889-817f-3cbe0492dfe0" (UID: "6ab816f0-aaab-4889-817f-3cbe0492dfe0"). InnerVolumeSpecName "kube-api-access-btm59". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:21:48 crc kubenswrapper[4681]: I1007 17:21:48.283808 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fd1c96e-3ade-4991-a129-9e12fd5837db-kube-api-access-pwgkq" (OuterVolumeSpecName: "kube-api-access-pwgkq") pod "2fd1c96e-3ade-4991-a129-9e12fd5837db" (UID: "2fd1c96e-3ade-4991-a129-9e12fd5837db"). InnerVolumeSpecName "kube-api-access-pwgkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:21:48 crc kubenswrapper[4681]: I1007 17:21:48.383221 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btm59\" (UniqueName: \"kubernetes.io/projected/6ab816f0-aaab-4889-817f-3cbe0492dfe0-kube-api-access-btm59\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:48 crc kubenswrapper[4681]: I1007 17:21:48.383267 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwgkq\" (UniqueName: \"kubernetes.io/projected/2fd1c96e-3ade-4991-a129-9e12fd5837db-kube-api-access-pwgkq\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:48 crc kubenswrapper[4681]: I1007 17:21:48.714020 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-xcdt5" event={"ID":"f5409c34-27ff-4970-a35c-5bb5ee377fb9","Type":"ContainerDied","Data":"511e42981e8345cd85c4498fe571b9663bf2a0734c2afb55514da09c1c4f509a"} Oct 07 17:21:48 crc kubenswrapper[4681]: I1007 17:21:48.714059 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="511e42981e8345cd85c4498fe571b9663bf2a0734c2afb55514da09c1c4f509a" Oct 07 17:21:48 crc kubenswrapper[4681]: I1007 17:21:48.714135 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-xcdt5" Oct 07 17:21:48 crc kubenswrapper[4681]: I1007 17:21:48.716853 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rhm4v" event={"ID":"6ab816f0-aaab-4889-817f-3cbe0492dfe0","Type":"ContainerDied","Data":"d98a7c6337bc13438090780242e21184b5f46b9141c9297cc4da446fed9d6a7f"} Oct 07 17:21:48 crc kubenswrapper[4681]: I1007 17:21:48.716907 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d98a7c6337bc13438090780242e21184b5f46b9141c9297cc4da446fed9d6a7f" Oct 07 17:21:48 crc kubenswrapper[4681]: I1007 17:21:48.716931 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rhm4v" Oct 07 17:21:48 crc kubenswrapper[4681]: I1007 17:21:48.725549 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e111df37-d4f7-4dc5-ad9a-04b05519309a","Type":"ContainerStarted","Data":"7380fc84b6ce2ba11dc344451f3076b19f8540ed761b1e8f26db27c776ca2b5b"} Oct 07 17:21:48 crc kubenswrapper[4681]: I1007 17:21:48.725590 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e111df37-d4f7-4dc5-ad9a-04b05519309a","Type":"ContainerStarted","Data":"119a7f0413ffacd8138a7e190d647e7652dafd3d1972265d575c06cb945152d6"} Oct 07 17:21:48 crc kubenswrapper[4681]: I1007 17:21:48.727539 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-d2gsl" event={"ID":"2fd1c96e-3ade-4991-a129-9e12fd5837db","Type":"ContainerDied","Data":"eec1782b3b22f30ab521ed0c5f0dab764538b1236cbacc46d335b93680c67e37"} Oct 07 17:21:48 crc kubenswrapper[4681]: I1007 17:21:48.727563 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-d2gsl" Oct 07 17:21:48 crc kubenswrapper[4681]: I1007 17:21:48.727578 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eec1782b3b22f30ab521ed0c5f0dab764538b1236cbacc46d335b93680c67e37" Oct 07 17:21:51 crc kubenswrapper[4681]: I1007 17:21:51.765647 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e111df37-d4f7-4dc5-ad9a-04b05519309a","Type":"ContainerStarted","Data":"2989efd5ce255bd2ce6f02ceaa3988e1d8be5e429f11a793f53a084897890463"} Oct 07 17:21:51 crc kubenswrapper[4681]: I1007 17:21:51.766383 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e111df37-d4f7-4dc5-ad9a-04b05519309a","Type":"ContainerStarted","Data":"44f8d1cacd1a1e23d5304d127edff046159710882be39dee186e9a324abc213e"} Oct 07 17:21:51 crc kubenswrapper[4681]: I1007 17:21:51.769201 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ht4mq" event={"ID":"6347ab39-7d52-4a04-ac0a-3df98268b8fe","Type":"ContainerStarted","Data":"4b81540b31ab1ead44d3eb742ea3865a8a49c299ce9949336faf0b5850aa3429"} Oct 07 17:21:51 crc kubenswrapper[4681]: I1007 17:21:51.793845 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-ht4mq" podStartSLOduration=5.83472826 podStartE2EDuration="10.793825536s" podCreationTimestamp="2025-10-07 17:21:41 +0000 UTC" firstStartedPulling="2025-10-07 17:21:45.916902574 +0000 UTC m=+1109.564314129" lastFinishedPulling="2025-10-07 17:21:50.87599985 +0000 UTC m=+1114.523411405" observedRunningTime="2025-10-07 17:21:51.792490199 +0000 UTC m=+1115.439901774" watchObservedRunningTime="2025-10-07 17:21:51.793825536 +0000 UTC m=+1115.441237092" Oct 07 17:21:52 crc kubenswrapper[4681]: I1007 17:21:52.781315 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e111df37-d4f7-4dc5-ad9a-04b05519309a","Type":"ContainerStarted","Data":"bf36afb6b63b699c79237b8585bde12bbfe0b9901a9ca4fdda64c2a161835f70"} Oct 07 17:21:52 crc kubenswrapper[4681]: I1007 17:21:52.781616 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e111df37-d4f7-4dc5-ad9a-04b05519309a","Type":"ContainerStarted","Data":"d105321501e33546adc6527264bd4138942bedd191d86911806b39de9e358624"} Oct 07 17:21:53 crc kubenswrapper[4681]: I1007 17:21:53.810199 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e111df37-d4f7-4dc5-ad9a-04b05519309a","Type":"ContainerStarted","Data":"66946e247b63ecb31e1b4abcf2e63c3e89ec2c4577289b60052b1e7e61d69e5f"} Oct 07 17:21:53 crc kubenswrapper[4681]: I1007 17:21:53.810497 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e111df37-d4f7-4dc5-ad9a-04b05519309a","Type":"ContainerStarted","Data":"d6f6508f6762457b68d5131b4b3ff6d89875336010443824d4dd029d289969ea"} Oct 07 17:21:53 crc kubenswrapper[4681]: I1007 17:21:53.810509 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e111df37-d4f7-4dc5-ad9a-04b05519309a","Type":"ContainerStarted","Data":"e49e8858c64d2eedbdb38e7ffb6ebd74c425ffb12056adaa2fcf5c762106c85e"} Oct 07 17:21:53 crc kubenswrapper[4681]: I1007 17:21:53.810518 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"e111df37-d4f7-4dc5-ad9a-04b05519309a","Type":"ContainerStarted","Data":"a6573929cf51d82e0673854eead6d1cff6191e15605dc769f45f7a8811daba19"} Oct 07 17:21:54 crc kubenswrapper[4681]: I1007 17:21:54.823507 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e111df37-d4f7-4dc5-ad9a-04b05519309a","Type":"ContainerStarted","Data":"e846d307181ca148de5b90f11856f3bb179c163389b038e8a09e96ea17e2f130"} Oct 07 17:21:54 crc kubenswrapper[4681]: I1007 17:21:54.823831 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e111df37-d4f7-4dc5-ad9a-04b05519309a","Type":"ContainerStarted","Data":"064e740d6c5af7d4e316e25d678400c35474d6149f9efe3dbe5c8626db769d3f"} Oct 07 17:21:54 crc kubenswrapper[4681]: I1007 17:21:54.823842 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e111df37-d4f7-4dc5-ad9a-04b05519309a","Type":"ContainerStarted","Data":"f925d2a5a743c1c7d0e47e2e4aae418f7fc0bb1e3610c7fcc9f2d5a3c15baa96"} Oct 07 17:21:54 crc kubenswrapper[4681]: I1007 17:21:54.826339 4681 generic.go:334] "Generic (PLEG): container finished" podID="6347ab39-7d52-4a04-ac0a-3df98268b8fe" containerID="4b81540b31ab1ead44d3eb742ea3865a8a49c299ce9949336faf0b5850aa3429" exitCode=0 Oct 07 17:21:54 crc kubenswrapper[4681]: I1007 17:21:54.826399 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ht4mq" event={"ID":"6347ab39-7d52-4a04-ac0a-3df98268b8fe","Type":"ContainerDied","Data":"4b81540b31ab1ead44d3eb742ea3865a8a49c299ce9949336faf0b5850aa3429"} Oct 07 17:21:54 crc kubenswrapper[4681]: I1007 17:21:54.883194 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=41.921881069 podStartE2EDuration="48.883177903s" podCreationTimestamp="2025-10-07 17:21:06 +0000 UTC" firstStartedPulling="2025-10-07 17:21:46.139128554 +0000 UTC m=+1109.786540109" lastFinishedPulling="2025-10-07 17:21:53.100425388 +0000 UTC m=+1116.747836943" observedRunningTime="2025-10-07 17:21:54.857983183 +0000 UTC m=+1118.505394758" watchObservedRunningTime="2025-10-07 17:21:54.883177903 +0000 UTC m=+1118.530589458" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.108477 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-6brxc"] Oct 07 17:21:55 crc kubenswrapper[4681]: E1007 17:21:55.108864 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ab816f0-aaab-4889-817f-3cbe0492dfe0" containerName="mariadb-database-create" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.108898 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ab816f0-aaab-4889-817f-3cbe0492dfe0" containerName="mariadb-database-create" Oct 07 17:21:55 crc kubenswrapper[4681]: E1007 17:21:55.108924 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5409c34-27ff-4970-a35c-5bb5ee377fb9" containerName="mariadb-database-create" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.108934 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5409c34-27ff-4970-a35c-5bb5ee377fb9" containerName="mariadb-database-create" Oct 07 17:21:55 crc kubenswrapper[4681]: E1007 17:21:55.108956 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fd1c96e-3ade-4991-a129-9e12fd5837db" containerName="mariadb-database-create" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.108964 4681 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2fd1c96e-3ade-4991-a129-9e12fd5837db" containerName="mariadb-database-create" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.109157 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fd1c96e-3ade-4991-a129-9e12fd5837db" containerName="mariadb-database-create" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.109179 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5409c34-27ff-4970-a35c-5bb5ee377fb9" containerName="mariadb-database-create" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.109204 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ab816f0-aaab-4889-817f-3cbe0492dfe0" containerName="mariadb-database-create" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.110285 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.115717 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.125321 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-6brxc"] Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.295151 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-6brxc\" (UID: \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\") " pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.295206 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7stn\" (UniqueName: \"kubernetes.io/projected/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-kube-api-access-r7stn\") pod \"dnsmasq-dns-5c79d794d7-6brxc\" (UID: \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\") " pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.295248 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-6brxc\" (UID: \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\") " pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.295308 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-6brxc\" (UID: \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\") " pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.295354 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-config\") pod \"dnsmasq-dns-5c79d794d7-6brxc\" (UID: \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\") " pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.295429 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-dns-svc\") pod 
\"dnsmasq-dns-5c79d794d7-6brxc\" (UID: \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\") " pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.396770 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-6brxc\" (UID: \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\") " pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.397304 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-6brxc\" (UID: \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\") " pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.397343 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-config\") pod \"dnsmasq-dns-5c79d794d7-6brxc\" (UID: \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\") " pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.397384 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-6brxc\" (UID: \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\") " pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.397419 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-6brxc\" (UID: \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\") " pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.397449 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7stn\" (UniqueName: \"kubernetes.io/projected/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-kube-api-access-r7stn\") pod \"dnsmasq-dns-5c79d794d7-6brxc\" (UID: \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\") " pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.398289 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-6brxc\" (UID: \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\") " pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.398603 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-6brxc\" (UID: \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\") " pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.398900 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-6brxc\" (UID: \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\") " 
pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.399436 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-6brxc\" (UID: \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\") " pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.399681 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-config\") pod \"dnsmasq-dns-5c79d794d7-6brxc\" (UID: \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\") " pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.415726 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7stn\" (UniqueName: \"kubernetes.io/projected/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-kube-api-access-r7stn\") pod \"dnsmasq-dns-5c79d794d7-6brxc\" (UID: \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\") " pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" Oct 07 17:21:55 crc kubenswrapper[4681]: I1007 17:21:55.430731 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" Oct 07 17:21:56 crc kubenswrapper[4681]: I1007 17:21:55.897762 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-6brxc"] Oct 07 17:21:56 crc kubenswrapper[4681]: W1007 17:21:55.916533 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9321d390_e73b_48db_9f87_7e9a5ba5e1fd.slice/crio-b5c4683886d23b410a4e1ce94659c9ae6dda3e9348f9dc49de961c7966e4139c WatchSource:0}: Error finding container b5c4683886d23b410a4e1ce94659c9ae6dda3e9348f9dc49de961c7966e4139c: Status 404 returned error can't find the container with id b5c4683886d23b410a4e1ce94659c9ae6dda3e9348f9dc49de961c7966e4139c Oct 07 17:21:56 crc kubenswrapper[4681]: I1007 17:21:56.571904 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-ht4mq" Oct 07 17:21:56 crc kubenswrapper[4681]: I1007 17:21:56.722400 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6347ab39-7d52-4a04-ac0a-3df98268b8fe-combined-ca-bundle\") pod \"6347ab39-7d52-4a04-ac0a-3df98268b8fe\" (UID: \"6347ab39-7d52-4a04-ac0a-3df98268b8fe\") " Oct 07 17:21:56 crc kubenswrapper[4681]: I1007 17:21:56.722588 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9z76\" (UniqueName: \"kubernetes.io/projected/6347ab39-7d52-4a04-ac0a-3df98268b8fe-kube-api-access-d9z76\") pod \"6347ab39-7d52-4a04-ac0a-3df98268b8fe\" (UID: \"6347ab39-7d52-4a04-ac0a-3df98268b8fe\") " Oct 07 17:21:56 crc kubenswrapper[4681]: I1007 17:21:56.722672 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6347ab39-7d52-4a04-ac0a-3df98268b8fe-config-data\") pod \"6347ab39-7d52-4a04-ac0a-3df98268b8fe\" (UID: \"6347ab39-7d52-4a04-ac0a-3df98268b8fe\") " Oct 07 17:21:56 crc kubenswrapper[4681]: I1007 17:21:56.731350 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6347ab39-7d52-4a04-ac0a-3df98268b8fe-kube-api-access-d9z76" (OuterVolumeSpecName: "kube-api-access-d9z76") pod "6347ab39-7d52-4a04-ac0a-3df98268b8fe" (UID: "6347ab39-7d52-4a04-ac0a-3df98268b8fe"). InnerVolumeSpecName "kube-api-access-d9z76". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:21:56 crc kubenswrapper[4681]: I1007 17:21:56.749206 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6347ab39-7d52-4a04-ac0a-3df98268b8fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6347ab39-7d52-4a04-ac0a-3df98268b8fe" (UID: "6347ab39-7d52-4a04-ac0a-3df98268b8fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:21:56 crc kubenswrapper[4681]: I1007 17:21:56.792843 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6347ab39-7d52-4a04-ac0a-3df98268b8fe-config-data" (OuterVolumeSpecName: "config-data") pod "6347ab39-7d52-4a04-ac0a-3df98268b8fe" (UID: "6347ab39-7d52-4a04-ac0a-3df98268b8fe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:21:56 crc kubenswrapper[4681]: I1007 17:21:56.825119 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6347ab39-7d52-4a04-ac0a-3df98268b8fe-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:56 crc kubenswrapper[4681]: I1007 17:21:56.825157 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6347ab39-7d52-4a04-ac0a-3df98268b8fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:56 crc kubenswrapper[4681]: I1007 17:21:56.825174 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9z76\" (UniqueName: \"kubernetes.io/projected/6347ab39-7d52-4a04-ac0a-3df98268b8fe-kube-api-access-d9z76\") on node \"crc\" DevicePath \"\"" Oct 07 17:21:56 crc kubenswrapper[4681]: I1007 17:21:56.843032 4681 generic.go:334] "Generic (PLEG): container finished" podID="9321d390-e73b-48db-9f87-7e9a5ba5e1fd" containerID="fa32f5f9b62796c460b420238dfac9f3d68c3b05e3a1db483e99dcb79df5857f" exitCode=0 Oct 07 17:21:56 crc kubenswrapper[4681]: I1007 17:21:56.843132 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" event={"ID":"9321d390-e73b-48db-9f87-7e9a5ba5e1fd","Type":"ContainerDied","Data":"fa32f5f9b62796c460b420238dfac9f3d68c3b05e3a1db483e99dcb79df5857f"} Oct 07 17:21:56 crc kubenswrapper[4681]: I1007 17:21:56.843193 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" event={"ID":"9321d390-e73b-48db-9f87-7e9a5ba5e1fd","Type":"ContainerStarted","Data":"b5c4683886d23b410a4e1ce94659c9ae6dda3e9348f9dc49de961c7966e4139c"} Oct 07 17:21:56 crc kubenswrapper[4681]: I1007 17:21:56.845159 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ht4mq" event={"ID":"6347ab39-7d52-4a04-ac0a-3df98268b8fe","Type":"ContainerDied","Data":"916a9888a318363524377ba04cba7b8e09ef1df15044c2eb55e317f8d497d424"} Oct 07 17:21:56 crc kubenswrapper[4681]: I1007 17:21:56.845188 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="916a9888a318363524377ba04cba7b8e09ef1df15044c2eb55e317f8d497d424" Oct 07 17:21:56 crc kubenswrapper[4681]: I1007 17:21:56.845236 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ht4mq" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.144405 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-6brxc"] Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.232454 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b868669f-dd57p"] Oct 07 17:21:57 crc kubenswrapper[4681]: E1007 17:21:57.235593 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6347ab39-7d52-4a04-ac0a-3df98268b8fe" containerName="keystone-db-sync" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.235621 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="6347ab39-7d52-4a04-ac0a-3df98268b8fe" containerName="keystone-db-sync" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.236015 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="6347ab39-7d52-4a04-ac0a-3df98268b8fe" containerName="keystone-db-sync" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.237619 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-dd57p" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.247289 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-pf79f"] Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.248371 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pf79f" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.250555 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.250692 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.250825 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.252833 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4bnt7" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.263928 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pf79f"] Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.278338 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-dd57p"] Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.336832 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-scripts\") pod \"keystone-bootstrap-pf79f\" (UID: \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\") " pod="openstack/keystone-bootstrap-pf79f" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.336968 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-credential-keys\") pod \"keystone-bootstrap-pf79f\" (UID: \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\") " pod="openstack/keystone-bootstrap-pf79f" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.336997 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khknk\" (UniqueName: \"kubernetes.io/projected/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-kube-api-access-khknk\") pod \"keystone-bootstrap-pf79f\" (UID: \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\") " pod="openstack/keystone-bootstrap-pf79f" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.337019 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfc9x\" (UniqueName: \"kubernetes.io/projected/10b22a35-0120-4b09-aa29-9e8a8fe6b802-kube-api-access-kfc9x\") pod \"dnsmasq-dns-5b868669f-dd57p\" (UID: \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\") " pod="openstack/dnsmasq-dns-5b868669f-dd57p" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.337042 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-dd57p\" (UID: \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\") " pod="openstack/dnsmasq-dns-5b868669f-dd57p" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.337063 4681 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-config\") pod \"dnsmasq-dns-5b868669f-dd57p\" (UID: \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\") " pod="openstack/dnsmasq-dns-5b868669f-dd57p" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.337098 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-fernet-keys\") pod \"keystone-bootstrap-pf79f\" (UID: \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\") " pod="openstack/keystone-bootstrap-pf79f" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.337117 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-dd57p\" (UID: \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\") " pod="openstack/dnsmasq-dns-5b868669f-dd57p" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.337135 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-dns-svc\") pod \"dnsmasq-dns-5b868669f-dd57p\" (UID: \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\") " pod="openstack/dnsmasq-dns-5b868669f-dd57p" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.337154 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-dd57p\" (UID: \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\") " pod="openstack/dnsmasq-dns-5b868669f-dd57p" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.337182 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-config-data\") pod \"keystone-bootstrap-pf79f\" (UID: \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\") " pod="openstack/keystone-bootstrap-pf79f" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.337199 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-combined-ca-bundle\") pod \"keystone-bootstrap-pf79f\" (UID: \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\") " pod="openstack/keystone-bootstrap-pf79f" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.410488 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.413353 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.418119 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.422443 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.424420 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.438314 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-config-data\") pod \"keystone-bootstrap-pf79f\" (UID: \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\") " pod="openstack/keystone-bootstrap-pf79f" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.438526 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-combined-ca-bundle\") pod \"keystone-bootstrap-pf79f\" (UID: \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\") " pod="openstack/keystone-bootstrap-pf79f" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.439065 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-scripts\") pod \"keystone-bootstrap-pf79f\" (UID: \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\") " pod="openstack/keystone-bootstrap-pf79f" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.439191 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-credential-keys\") pod \"keystone-bootstrap-pf79f\" (UID: \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\") " pod="openstack/keystone-bootstrap-pf79f" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.439313 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khknk\" (UniqueName: \"kubernetes.io/projected/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-kube-api-access-khknk\") pod \"keystone-bootstrap-pf79f\" (UID: \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\") " pod="openstack/keystone-bootstrap-pf79f" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.439401 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfc9x\" (UniqueName: \"kubernetes.io/projected/10b22a35-0120-4b09-aa29-9e8a8fe6b802-kube-api-access-kfc9x\") pod \"dnsmasq-dns-5b868669f-dd57p\" (UID: \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\") " pod="openstack/dnsmasq-dns-5b868669f-dd57p" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.439486 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-dd57p\" (UID: \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\") " pod="openstack/dnsmasq-dns-5b868669f-dd57p" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.439566 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-config\") pod \"dnsmasq-dns-5b868669f-dd57p\" (UID: \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\") " 
pod="openstack/dnsmasq-dns-5b868669f-dd57p" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.439682 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-fernet-keys\") pod \"keystone-bootstrap-pf79f\" (UID: \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\") " pod="openstack/keystone-bootstrap-pf79f" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.439790 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-dd57p\" (UID: \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\") " pod="openstack/dnsmasq-dns-5b868669f-dd57p" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.439871 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-dns-svc\") pod \"dnsmasq-dns-5b868669f-dd57p\" (UID: \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\") " pod="openstack/dnsmasq-dns-5b868669f-dd57p" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.439975 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-dd57p\" (UID: \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\") " pod="openstack/dnsmasq-dns-5b868669f-dd57p" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.442137 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-dd57p\" (UID: \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\") " pod="openstack/dnsmasq-dns-5b868669f-dd57p" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.442191 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-config\") pod \"dnsmasq-dns-5b868669f-dd57p\" (UID: \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\") " pod="openstack/dnsmasq-dns-5b868669f-dd57p" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.442674 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-dns-svc\") pod \"dnsmasq-dns-5b868669f-dd57p\" (UID: \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\") " pod="openstack/dnsmasq-dns-5b868669f-dd57p" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.442692 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-dd57p\" (UID: \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\") " pod="openstack/dnsmasq-dns-5b868669f-dd57p" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.442919 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-dd57p\" (UID: \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\") " pod="openstack/dnsmasq-dns-5b868669f-dd57p" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.443788 4681 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-combined-ca-bundle\") pod \"keystone-bootstrap-pf79f\" (UID: \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\") " pod="openstack/keystone-bootstrap-pf79f" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.444394 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-scripts\") pod \"keystone-bootstrap-pf79f\" (UID: \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\") " pod="openstack/keystone-bootstrap-pf79f" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.444607 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-credential-keys\") pod \"keystone-bootstrap-pf79f\" (UID: \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\") " pod="openstack/keystone-bootstrap-pf79f" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.450617 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-config-data\") pod \"keystone-bootstrap-pf79f\" (UID: \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\") " pod="openstack/keystone-bootstrap-pf79f" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.467499 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-fernet-keys\") pod \"keystone-bootstrap-pf79f\" (UID: \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\") " pod="openstack/keystone-bootstrap-pf79f" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.488339 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8459b45747-n55dk"] Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.489667 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8459b45747-n55dk" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.493501 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.494281 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.494525 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-fthp5" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.494768 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.515679 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khknk\" (UniqueName: \"kubernetes.io/projected/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-kube-api-access-khknk\") pod \"keystone-bootstrap-pf79f\" (UID: \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\") " pod="openstack/keystone-bootstrap-pf79f" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.526470 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfc9x\" (UniqueName: \"kubernetes.io/projected/10b22a35-0120-4b09-aa29-9e8a8fe6b802-kube-api-access-kfc9x\") pod \"dnsmasq-dns-5b868669f-dd57p\" (UID: \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\") " pod="openstack/dnsmasq-dns-5b868669f-dd57p" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.530524 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8459b45747-n55dk"] Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.541711 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcr7v\" (UniqueName: \"kubernetes.io/projected/4673f09e-2140-4dc5-ac9d-af616ddba08d-kube-api-access-hcr7v\") pod \"ceilometer-0\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") " pod="openstack/ceilometer-0" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.541784 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4673f09e-2140-4dc5-ac9d-af616ddba08d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") " pod="openstack/ceilometer-0" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.541862 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4673f09e-2140-4dc5-ac9d-af616ddba08d-run-httpd\") pod \"ceilometer-0\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") " pod="openstack/ceilometer-0" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.541955 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4673f09e-2140-4dc5-ac9d-af616ddba08d-log-httpd\") pod \"ceilometer-0\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") " pod="openstack/ceilometer-0" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.542015 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4673f09e-2140-4dc5-ac9d-af616ddba08d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") " pod="openstack/ceilometer-0" Oct 07 
17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.542034 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4673f09e-2140-4dc5-ac9d-af616ddba08d-config-data\") pod \"ceilometer-0\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") " pod="openstack/ceilometer-0" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.542081 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4673f09e-2140-4dc5-ac9d-af616ddba08d-scripts\") pod \"ceilometer-0\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") " pod="openstack/ceilometer-0" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.572496 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-dd57p" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.584279 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pf79f" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.644443 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcr7v\" (UniqueName: \"kubernetes.io/projected/4673f09e-2140-4dc5-ac9d-af616ddba08d-kube-api-access-hcr7v\") pod \"ceilometer-0\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") " pod="openstack/ceilometer-0" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.644488 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4673f09e-2140-4dc5-ac9d-af616ddba08d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") " pod="openstack/ceilometer-0" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.644513 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-logs\") pod \"horizon-8459b45747-n55dk\" (UID: \"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e\") " pod="openstack/horizon-8459b45747-n55dk" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.644544 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-config-data\") pod \"horizon-8459b45747-n55dk\" (UID: \"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e\") " pod="openstack/horizon-8459b45747-n55dk" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.644580 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4673f09e-2140-4dc5-ac9d-af616ddba08d-run-httpd\") pod \"ceilometer-0\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") " pod="openstack/ceilometer-0" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.644619 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4673f09e-2140-4dc5-ac9d-af616ddba08d-log-httpd\") pod \"ceilometer-0\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") " pod="openstack/ceilometer-0" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.644636 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4673f09e-2140-4dc5-ac9d-af616ddba08d-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") " pod="openstack/ceilometer-0" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.644654 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4673f09e-2140-4dc5-ac9d-af616ddba08d-config-data\") pod \"ceilometer-0\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") " pod="openstack/ceilometer-0" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.644680 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj6pg\" (UniqueName: \"kubernetes.io/projected/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-kube-api-access-qj6pg\") pod \"horizon-8459b45747-n55dk\" (UID: \"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e\") " pod="openstack/horizon-8459b45747-n55dk" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.644700 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4673f09e-2140-4dc5-ac9d-af616ddba08d-scripts\") pod \"ceilometer-0\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") " pod="openstack/ceilometer-0" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.644715 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-horizon-secret-key\") pod \"horizon-8459b45747-n55dk\" (UID: \"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e\") " pod="openstack/horizon-8459b45747-n55dk" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.644741 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-scripts\") pod \"horizon-8459b45747-n55dk\" (UID: \"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e\") " pod="openstack/horizon-8459b45747-n55dk" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.657503 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4673f09e-2140-4dc5-ac9d-af616ddba08d-run-httpd\") pod \"ceilometer-0\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") " pod="openstack/ceilometer-0" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.657701 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4673f09e-2140-4dc5-ac9d-af616ddba08d-log-httpd\") pod \"ceilometer-0\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") " pod="openstack/ceilometer-0" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.659189 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4673f09e-2140-4dc5-ac9d-af616ddba08d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") " pod="openstack/ceilometer-0" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.673020 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4673f09e-2140-4dc5-ac9d-af616ddba08d-scripts\") pod \"ceilometer-0\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") " pod="openstack/ceilometer-0" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.673222 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/4673f09e-2140-4dc5-ac9d-af616ddba08d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") " pod="openstack/ceilometer-0" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.673605 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4673f09e-2140-4dc5-ac9d-af616ddba08d-config-data\") pod \"ceilometer-0\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") " pod="openstack/ceilometer-0" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.686067 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcr7v\" (UniqueName: \"kubernetes.io/projected/4673f09e-2140-4dc5-ac9d-af616ddba08d-kube-api-access-hcr7v\") pod \"ceilometer-0\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") " pod="openstack/ceilometer-0" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.721917 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-76c6b58665-pvxbw"] Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.723513 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76c6b58665-pvxbw" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.747131 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-logs\") pod \"horizon-8459b45747-n55dk\" (UID: \"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e\") " pod="openstack/horizon-8459b45747-n55dk" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.747941 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-logs\") pod \"horizon-8459b45747-n55dk\" (UID: \"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e\") " pod="openstack/horizon-8459b45747-n55dk" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.747185 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-config-data\") pod \"horizon-8459b45747-n55dk\" (UID: \"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e\") " pod="openstack/horizon-8459b45747-n55dk" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.752666 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj6pg\" (UniqueName: \"kubernetes.io/projected/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-kube-api-access-qj6pg\") pod \"horizon-8459b45747-n55dk\" (UID: \"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e\") " pod="openstack/horizon-8459b45747-n55dk" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.752987 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-horizon-secret-key\") pod \"horizon-8459b45747-n55dk\" (UID: \"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e\") " pod="openstack/horizon-8459b45747-n55dk" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.753068 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-scripts\") pod \"horizon-8459b45747-n55dk\" (UID: \"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e\") " pod="openstack/horizon-8459b45747-n55dk" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.753625 4681 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-scripts\") pod \"horizon-8459b45747-n55dk\" (UID: \"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e\") " pod="openstack/horizon-8459b45747-n55dk" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.753710 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-config-data\") pod \"horizon-8459b45747-n55dk\" (UID: \"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e\") " pod="openstack/horizon-8459b45747-n55dk" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.754172 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76c6b58665-pvxbw"] Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.757161 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-horizon-secret-key\") pod \"horizon-8459b45747-n55dk\" (UID: \"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e\") " pod="openstack/horizon-8459b45747-n55dk" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.801527 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj6pg\" (UniqueName: \"kubernetes.io/projected/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-kube-api-access-qj6pg\") pod \"horizon-8459b45747-n55dk\" (UID: \"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e\") " pod="openstack/horizon-8459b45747-n55dk" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.830853 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-dd57p"] Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.847612 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-bhfl5"] Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.848953 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.862747 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-config-data\") pod \"horizon-76c6b58665-pvxbw\" (UID: \"4c67fd19-2e0a-4afa-a595-fce5c29c3f18\") " pod="openstack/horizon-76c6b58665-pvxbw" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.862831 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkcmv\" (UniqueName: \"kubernetes.io/projected/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-kube-api-access-mkcmv\") pod \"horizon-76c6b58665-pvxbw\" (UID: \"4c67fd19-2e0a-4afa-a595-fce5c29c3f18\") " pod="openstack/horizon-76c6b58665-pvxbw" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.862983 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-logs\") pod \"horizon-76c6b58665-pvxbw\" (UID: \"4c67fd19-2e0a-4afa-a595-fce5c29c3f18\") " pod="openstack/horizon-76c6b58665-pvxbw" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.865093 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-scripts\") pod \"horizon-76c6b58665-pvxbw\" (UID: \"4c67fd19-2e0a-4afa-a595-fce5c29c3f18\") " pod="openstack/horizon-76c6b58665-pvxbw" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.865161 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-horizon-secret-key\") pod \"horizon-76c6b58665-pvxbw\" (UID: \"4c67fd19-2e0a-4afa-a595-fce5c29c3f18\") " pod="openstack/horizon-76c6b58665-pvxbw" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.872672 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-bhfl5"] Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.886999 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" event={"ID":"9321d390-e73b-48db-9f87-7e9a5ba5e1fd","Type":"ContainerStarted","Data":"05d6c467f6709cb3b68fb25926f26eead9e6386c7b661881059cd415af9216c2"} Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.887153 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" podUID="9321d390-e73b-48db-9f87-7e9a5ba5e1fd" containerName="dnsmasq-dns" containerID="cri-o://05d6c467f6709cb3b68fb25926f26eead9e6386c7b661881059cd415af9216c2" gracePeriod=10 Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.887431 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.896760 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.916189 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8459b45747-n55dk" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.922709 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" podStartSLOduration=2.922688145 podStartE2EDuration="2.922688145s" podCreationTimestamp="2025-10-07 17:21:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:21:57.915850705 +0000 UTC m=+1121.563262260" watchObservedRunningTime="2025-10-07 17:21:57.922688145 +0000 UTC m=+1121.570099700" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.958848 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-m2s62"] Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.961218 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-m2s62" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.967698 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-pvcw8" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.967707 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-scripts\") pod \"horizon-76c6b58665-pvxbw\" (UID: \"4c67fd19-2e0a-4afa-a595-fce5c29c3f18\") " pod="openstack/horizon-76c6b58665-pvxbw" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.967774 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-horizon-secret-key\") pod \"horizon-76c6b58665-pvxbw\" (UID: \"4c67fd19-2e0a-4afa-a595-fce5c29c3f18\") " pod="openstack/horizon-76c6b58665-pvxbw" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.967805 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-config-data\") pod \"horizon-76c6b58665-pvxbw\" (UID: \"4c67fd19-2e0a-4afa-a595-fce5c29c3f18\") " pod="openstack/horizon-76c6b58665-pvxbw" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.967840 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkcmv\" (UniqueName: \"kubernetes.io/projected/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-kube-api-access-mkcmv\") pod \"horizon-76c6b58665-pvxbw\" (UID: \"4c67fd19-2e0a-4afa-a595-fce5c29c3f18\") " pod="openstack/horizon-76c6b58665-pvxbw" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.967868 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-config\") pod \"dnsmasq-dns-cf78879c9-bhfl5\" (UID: \"e5f57f38-603f-48d1-9326-8b5183fe99ae\") " pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.967921 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-dns-svc\") pod \"dnsmasq-dns-cf78879c9-bhfl5\" (UID: \"e5f57f38-603f-48d1-9326-8b5183fe99ae\") " pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.967942 4681 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.967958 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-bhfl5\" (UID: \"e5f57f38-603f-48d1-9326-8b5183fe99ae\") " pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.968004 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-bhfl5\" (UID: \"e5f57f38-603f-48d1-9326-8b5183fe99ae\") " pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.968041 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-logs\") pod \"horizon-76c6b58665-pvxbw\" (UID: \"4c67fd19-2e0a-4afa-a595-fce5c29c3f18\") " pod="openstack/horizon-76c6b58665-pvxbw" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.968106 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhnkx\" (UniqueName: \"kubernetes.io/projected/e5f57f38-603f-48d1-9326-8b5183fe99ae-kube-api-access-xhnkx\") pod \"dnsmasq-dns-cf78879c9-bhfl5\" (UID: \"e5f57f38-603f-48d1-9326-8b5183fe99ae\") " pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.968130 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-bhfl5\" (UID: \"e5f57f38-603f-48d1-9326-8b5183fe99ae\") " pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.968747 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.969657 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-config-data\") pod \"horizon-76c6b58665-pvxbw\" (UID: \"4c67fd19-2e0a-4afa-a595-fce5c29c3f18\") " pod="openstack/horizon-76c6b58665-pvxbw" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.971299 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-scripts\") pod \"horizon-76c6b58665-pvxbw\" (UID: \"4c67fd19-2e0a-4afa-a595-fce5c29c3f18\") " pod="openstack/horizon-76c6b58665-pvxbw" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.979819 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-logs\") pod \"horizon-76c6b58665-pvxbw\" (UID: \"4c67fd19-2e0a-4afa-a595-fce5c29c3f18\") " pod="openstack/horizon-76c6b58665-pvxbw" Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.981448 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-m2s62"] Oct 07 17:21:57 crc kubenswrapper[4681]: I1007 17:21:57.987505 4681 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-horizon-secret-key\") pod \"horizon-76c6b58665-pvxbw\" (UID: \"4c67fd19-2e0a-4afa-a595-fce5c29c3f18\") " pod="openstack/horizon-76c6b58665-pvxbw" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.000265 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkcmv\" (UniqueName: \"kubernetes.io/projected/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-kube-api-access-mkcmv\") pod \"horizon-76c6b58665-pvxbw\" (UID: \"4c67fd19-2e0a-4afa-a595-fce5c29c3f18\") " pod="openstack/horizon-76c6b58665-pvxbw" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.074600 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdb73657-7045-4536-b856-81fcc6da6718-combined-ca-bundle\") pod \"placement-db-sync-m2s62\" (UID: \"fdb73657-7045-4536-b856-81fcc6da6718\") " pod="openstack/placement-db-sync-m2s62" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.074662 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9vgt\" (UniqueName: \"kubernetes.io/projected/fdb73657-7045-4536-b856-81fcc6da6718-kube-api-access-v9vgt\") pod \"placement-db-sync-m2s62\" (UID: \"fdb73657-7045-4536-b856-81fcc6da6718\") " pod="openstack/placement-db-sync-m2s62" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.074692 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhnkx\" (UniqueName: \"kubernetes.io/projected/e5f57f38-603f-48d1-9326-8b5183fe99ae-kube-api-access-xhnkx\") pod \"dnsmasq-dns-cf78879c9-bhfl5\" (UID: \"e5f57f38-603f-48d1-9326-8b5183fe99ae\") " pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.074711 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-bhfl5\" (UID: \"e5f57f38-603f-48d1-9326-8b5183fe99ae\") " pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.074729 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdb73657-7045-4536-b856-81fcc6da6718-scripts\") pod \"placement-db-sync-m2s62\" (UID: \"fdb73657-7045-4536-b856-81fcc6da6718\") " pod="openstack/placement-db-sync-m2s62" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.074792 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-config\") pod \"dnsmasq-dns-cf78879c9-bhfl5\" (UID: \"e5f57f38-603f-48d1-9326-8b5183fe99ae\") " pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.074813 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-dns-svc\") pod \"dnsmasq-dns-cf78879c9-bhfl5\" (UID: \"e5f57f38-603f-48d1-9326-8b5183fe99ae\") " pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.074833 4681 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdb73657-7045-4536-b856-81fcc6da6718-config-data\") pod \"placement-db-sync-m2s62\" (UID: \"fdb73657-7045-4536-b856-81fcc6da6718\") " pod="openstack/placement-db-sync-m2s62" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.074857 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-bhfl5\" (UID: \"e5f57f38-603f-48d1-9326-8b5183fe99ae\") " pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.074890 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-bhfl5\" (UID: \"e5f57f38-603f-48d1-9326-8b5183fe99ae\") " pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.074914 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdb73657-7045-4536-b856-81fcc6da6718-logs\") pod \"placement-db-sync-m2s62\" (UID: \"fdb73657-7045-4536-b856-81fcc6da6718\") " pod="openstack/placement-db-sync-m2s62" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.075755 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-config\") pod \"dnsmasq-dns-cf78879c9-bhfl5\" (UID: \"e5f57f38-603f-48d1-9326-8b5183fe99ae\") " pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.076233 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-bhfl5\" (UID: \"e5f57f38-603f-48d1-9326-8b5183fe99ae\") " pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.076305 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-dns-svc\") pod \"dnsmasq-dns-cf78879c9-bhfl5\" (UID: \"e5f57f38-603f-48d1-9326-8b5183fe99ae\") " pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.076866 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-bhfl5\" (UID: \"e5f57f38-603f-48d1-9326-8b5183fe99ae\") " pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.078379 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-bhfl5\" (UID: \"e5f57f38-603f-48d1-9326-8b5183fe99ae\") " pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.094550 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhnkx\" (UniqueName: 
\"kubernetes.io/projected/e5f57f38-603f-48d1-9326-8b5183fe99ae-kube-api-access-xhnkx\") pod \"dnsmasq-dns-cf78879c9-bhfl5\" (UID: \"e5f57f38-603f-48d1-9326-8b5183fe99ae\") " pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.097707 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76c6b58665-pvxbw" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.179607 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdb73657-7045-4536-b856-81fcc6da6718-scripts\") pod \"placement-db-sync-m2s62\" (UID: \"fdb73657-7045-4536-b856-81fcc6da6718\") " pod="openstack/placement-db-sync-m2s62" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.179718 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdb73657-7045-4536-b856-81fcc6da6718-config-data\") pod \"placement-db-sync-m2s62\" (UID: \"fdb73657-7045-4536-b856-81fcc6da6718\") " pod="openstack/placement-db-sync-m2s62" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.179758 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdb73657-7045-4536-b856-81fcc6da6718-logs\") pod \"placement-db-sync-m2s62\" (UID: \"fdb73657-7045-4536-b856-81fcc6da6718\") " pod="openstack/placement-db-sync-m2s62" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.179783 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdb73657-7045-4536-b856-81fcc6da6718-combined-ca-bundle\") pod \"placement-db-sync-m2s62\" (UID: \"fdb73657-7045-4536-b856-81fcc6da6718\") " pod="openstack/placement-db-sync-m2s62" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.179829 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9vgt\" (UniqueName: \"kubernetes.io/projected/fdb73657-7045-4536-b856-81fcc6da6718-kube-api-access-v9vgt\") pod \"placement-db-sync-m2s62\" (UID: \"fdb73657-7045-4536-b856-81fcc6da6718\") " pod="openstack/placement-db-sync-m2s62" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.180392 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdb73657-7045-4536-b856-81fcc6da6718-logs\") pod \"placement-db-sync-m2s62\" (UID: \"fdb73657-7045-4536-b856-81fcc6da6718\") " pod="openstack/placement-db-sync-m2s62" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.183237 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdb73657-7045-4536-b856-81fcc6da6718-scripts\") pod \"placement-db-sync-m2s62\" (UID: \"fdb73657-7045-4536-b856-81fcc6da6718\") " pod="openstack/placement-db-sync-m2s62" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.185823 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdb73657-7045-4536-b856-81fcc6da6718-combined-ca-bundle\") pod \"placement-db-sync-m2s62\" (UID: \"fdb73657-7045-4536-b856-81fcc6da6718\") " pod="openstack/placement-db-sync-m2s62" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.186100 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fdb73657-7045-4536-b856-81fcc6da6718-config-data\") pod \"placement-db-sync-m2s62\" (UID: \"fdb73657-7045-4536-b856-81fcc6da6718\") " pod="openstack/placement-db-sync-m2s62" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.197967 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.202483 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9vgt\" (UniqueName: \"kubernetes.io/projected/fdb73657-7045-4536-b856-81fcc6da6718-kube-api-access-v9vgt\") pod \"placement-db-sync-m2s62\" (UID: \"fdb73657-7045-4536-b856-81fcc6da6718\") " pod="openstack/placement-db-sync-m2s62" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.290463 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-m2s62" Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.381287 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pf79f"] Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.387518 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-dd57p"] Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.584648 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8459b45747-n55dk"] Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.682192 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.913933 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8459b45747-n55dk" event={"ID":"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e","Type":"ContainerStarted","Data":"c62abcd3f0042ec372b93b90b4f0185d8234e7bf0d016b6be567fd5b1f424ee6"} Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.920431 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pf79f" event={"ID":"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b","Type":"ContainerStarted","Data":"40e3ce9772d22d6f847d14707fb3c470004d5221dd4070290da920e1d433129a"} Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.924407 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-dd57p" event={"ID":"10b22a35-0120-4b09-aa29-9e8a8fe6b802","Type":"ContainerStarted","Data":"4638af6993e4944de4634ba15cb38affae38d5309ba813c9b232d1fb2d6eb2d1"} Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.925832 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4673f09e-2140-4dc5-ac9d-af616ddba08d","Type":"ContainerStarted","Data":"63fbeffa619773135c369dc51c075283bb13b6d0d75273069f17c11e7701f65e"} Oct 07 17:21:58 crc kubenswrapper[4681]: I1007 17:21:58.952121 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76c6b58665-pvxbw"] Oct 07 17:21:58 crc kubenswrapper[4681]: W1007 17:21:58.989926 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c67fd19_2e0a_4afa_a595_fce5c29c3f18.slice/crio-4b225f8e8ee9f70e8221d3d20becacf540be2a5fef729af2c3fcbb925c032045 WatchSource:0}: Error finding container 4b225f8e8ee9f70e8221d3d20becacf540be2a5fef729af2c3fcbb925c032045: Status 404 returned error can't find the container with id 4b225f8e8ee9f70e8221d3d20becacf540be2a5fef729af2c3fcbb925c032045 Oct 07 17:21:59 crc kubenswrapper[4681]: 
I1007 17:21:59.236940 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-m2s62"] Oct 07 17:21:59 crc kubenswrapper[4681]: I1007 17:21:59.258603 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-bhfl5"] Oct 07 17:21:59 crc kubenswrapper[4681]: I1007 17:21:59.954962 4681 generic.go:334] "Generic (PLEG): container finished" podID="10b22a35-0120-4b09-aa29-9e8a8fe6b802" containerID="e5403e8b8c24593b2c473e5c6e15f24304169b53fd7795a2a3c16ecd3236eb9b" exitCode=0 Oct 07 17:21:59 crc kubenswrapper[4681]: I1007 17:21:59.955589 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-dd57p" event={"ID":"10b22a35-0120-4b09-aa29-9e8a8fe6b802","Type":"ContainerDied","Data":"e5403e8b8c24593b2c473e5c6e15f24304169b53fd7795a2a3c16ecd3236eb9b"} Oct 07 17:21:59 crc kubenswrapper[4681]: I1007 17:21:59.964707 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-m2s62" event={"ID":"fdb73657-7045-4536-b856-81fcc6da6718","Type":"ContainerStarted","Data":"845b91539d63df41d4b814a0d8bde2491b31caef80995ab7973d2ecb39765e58"} Oct 07 17:21:59 crc kubenswrapper[4681]: I1007 17:21:59.971157 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76c6b58665-pvxbw" event={"ID":"4c67fd19-2e0a-4afa-a595-fce5c29c3f18","Type":"ContainerStarted","Data":"4b225f8e8ee9f70e8221d3d20becacf540be2a5fef729af2c3fcbb925c032045"} Oct 07 17:21:59 crc kubenswrapper[4681]: I1007 17:21:59.990161 4681 generic.go:334] "Generic (PLEG): container finished" podID="e5f57f38-603f-48d1-9326-8b5183fe99ae" containerID="4eb16e791f50e5c3058388e2e1bb9d8e83ceb4e7aebaca2f0cc24b1fa829ca77" exitCode=0 Oct 07 17:21:59 crc kubenswrapper[4681]: I1007 17:21:59.990234 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" event={"ID":"e5f57f38-603f-48d1-9326-8b5183fe99ae","Type":"ContainerDied","Data":"4eb16e791f50e5c3058388e2e1bb9d8e83ceb4e7aebaca2f0cc24b1fa829ca77"} Oct 07 17:21:59 crc kubenswrapper[4681]: I1007 17:21:59.990259 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" event={"ID":"e5f57f38-603f-48d1-9326-8b5183fe99ae","Type":"ContainerStarted","Data":"02bfbf595d3d2df0ac70d82ba0d88fc21be259055301337480332113b56ddbca"} Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.008893 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pf79f" event={"ID":"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b","Type":"ContainerStarted","Data":"589a400fada1fcb5d074a5adabfeb4644cd9cca72461253fb4b784214c38a6fd"} Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.021107 4681 generic.go:334] "Generic (PLEG): container finished" podID="9321d390-e73b-48db-9f87-7e9a5ba5e1fd" containerID="05d6c467f6709cb3b68fb25926f26eead9e6386c7b661881059cd415af9216c2" exitCode=0 Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.021171 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" event={"ID":"9321d390-e73b-48db-9f87-7e9a5ba5e1fd","Type":"ContainerDied","Data":"05d6c467f6709cb3b68fb25926f26eead9e6386c7b661881059cd415af9216c2"} Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.021199 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" event={"ID":"9321d390-e73b-48db-9f87-7e9a5ba5e1fd","Type":"ContainerDied","Data":"b5c4683886d23b410a4e1ce94659c9ae6dda3e9348f9dc49de961c7966e4139c"} Oct 07 
17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.021208 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5c4683886d23b410a4e1ce94659c9ae6dda3e9348f9dc49de961c7966e4139c" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.035196 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.063262 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-pf79f" podStartSLOduration=3.063239065 podStartE2EDuration="3.063239065s" podCreationTimestamp="2025-10-07 17:21:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:22:00.057488875 +0000 UTC m=+1123.704900430" watchObservedRunningTime="2025-10-07 17:22:00.063239065 +0000 UTC m=+1123.710650620" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.146675 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-config\") pod \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\" (UID: \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\") " Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.146760 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7stn\" (UniqueName: \"kubernetes.io/projected/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-kube-api-access-r7stn\") pod \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\" (UID: \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\") " Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.146833 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-ovsdbserver-sb\") pod \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\" (UID: \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\") " Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.146912 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-dns-swift-storage-0\") pod \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\" (UID: \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\") " Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.146984 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-dns-svc\") pod \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\" (UID: \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\") " Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.147033 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-ovsdbserver-nb\") pod \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\" (UID: \"9321d390-e73b-48db-9f87-7e9a5ba5e1fd\") " Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.176218 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-kube-api-access-r7stn" (OuterVolumeSpecName: "kube-api-access-r7stn") pod "9321d390-e73b-48db-9f87-7e9a5ba5e1fd" (UID: "9321d390-e73b-48db-9f87-7e9a5ba5e1fd"). InnerVolumeSpecName "kube-api-access-r7stn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.253895 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7stn\" (UniqueName: \"kubernetes.io/projected/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-kube-api-access-r7stn\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.328549 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9321d390-e73b-48db-9f87-7e9a5ba5e1fd" (UID: "9321d390-e73b-48db-9f87-7e9a5ba5e1fd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.345517 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9321d390-e73b-48db-9f87-7e9a5ba5e1fd" (UID: "9321d390-e73b-48db-9f87-7e9a5ba5e1fd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.361193 4681 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.361224 4681 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.391138 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-config" (OuterVolumeSpecName: "config") pod "9321d390-e73b-48db-9f87-7e9a5ba5e1fd" (UID: "9321d390-e73b-48db-9f87-7e9a5ba5e1fd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.410572 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9321d390-e73b-48db-9f87-7e9a5ba5e1fd" (UID: "9321d390-e73b-48db-9f87-7e9a5ba5e1fd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.423435 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9321d390-e73b-48db-9f87-7e9a5ba5e1fd" (UID: "9321d390-e73b-48db-9f87-7e9a5ba5e1fd"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.463391 4681 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.463423 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.463432 4681 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9321d390-e73b-48db-9f87-7e9a5ba5e1fd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.700481 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-dd57p" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.770457 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-ovsdbserver-sb\") pod \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\" (UID: \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\") " Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.770514 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-config\") pod \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\" (UID: \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\") " Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.771094 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-dns-svc\") pod \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\" (UID: \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\") " Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.771120 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-ovsdbserver-nb\") pod \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\" (UID: \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\") " Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.771193 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfc9x\" (UniqueName: \"kubernetes.io/projected/10b22a35-0120-4b09-aa29-9e8a8fe6b802-kube-api-access-kfc9x\") pod \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\" (UID: \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\") " Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.771255 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-dns-swift-storage-0\") pod \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\" (UID: \"10b22a35-0120-4b09-aa29-9e8a8fe6b802\") " Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.799720 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10b22a35-0120-4b09-aa29-9e8a8fe6b802-kube-api-access-kfc9x" (OuterVolumeSpecName: "kube-api-access-kfc9x") pod "10b22a35-0120-4b09-aa29-9e8a8fe6b802" (UID: "10b22a35-0120-4b09-aa29-9e8a8fe6b802"). 
InnerVolumeSpecName "kube-api-access-kfc9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.809648 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-config" (OuterVolumeSpecName: "config") pod "10b22a35-0120-4b09-aa29-9e8a8fe6b802" (UID: "10b22a35-0120-4b09-aa29-9e8a8fe6b802"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.826787 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "10b22a35-0120-4b09-aa29-9e8a8fe6b802" (UID: "10b22a35-0120-4b09-aa29-9e8a8fe6b802"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.839951 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-009d-account-create-cwtbh"] Oct 07 17:22:00 crc kubenswrapper[4681]: E1007 17:22:00.840299 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10b22a35-0120-4b09-aa29-9e8a8fe6b802" containerName="init" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.840317 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b22a35-0120-4b09-aa29-9e8a8fe6b802" containerName="init" Oct 07 17:22:00 crc kubenswrapper[4681]: E1007 17:22:00.840338 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9321d390-e73b-48db-9f87-7e9a5ba5e1fd" containerName="dnsmasq-dns" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.840345 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="9321d390-e73b-48db-9f87-7e9a5ba5e1fd" containerName="dnsmasq-dns" Oct 07 17:22:00 crc kubenswrapper[4681]: E1007 17:22:00.840359 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9321d390-e73b-48db-9f87-7e9a5ba5e1fd" containerName="init" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.840364 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="9321d390-e73b-48db-9f87-7e9a5ba5e1fd" containerName="init" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.840510 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="10b22a35-0120-4b09-aa29-9e8a8fe6b802" containerName="init" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.840529 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="9321d390-e73b-48db-9f87-7e9a5ba5e1fd" containerName="dnsmasq-dns" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.841042 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-009d-account-create-cwtbh" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.850820 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.853916 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "10b22a35-0120-4b09-aa29-9e8a8fe6b802" (UID: "10b22a35-0120-4b09-aa29-9e8a8fe6b802"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.875153 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t8hq\" (UniqueName: \"kubernetes.io/projected/7f485ed7-a13a-4cee-b4ef-8df4e9659394-kube-api-access-9t8hq\") pod \"barbican-009d-account-create-cwtbh\" (UID: \"7f485ed7-a13a-4cee-b4ef-8df4e9659394\") " pod="openstack/barbican-009d-account-create-cwtbh" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.875622 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.875635 4681 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.875644 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfc9x\" (UniqueName: \"kubernetes.io/projected/10b22a35-0120-4b09-aa29-9e8a8fe6b802-kube-api-access-kfc9x\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.875670 4681 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.880409 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "10b22a35-0120-4b09-aa29-9e8a8fe6b802" (UID: "10b22a35-0120-4b09-aa29-9e8a8fe6b802"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.880488 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-009d-account-create-cwtbh"] Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.887248 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "10b22a35-0120-4b09-aa29-9e8a8fe6b802" (UID: "10b22a35-0120-4b09-aa29-9e8a8fe6b802"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.977373 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t8hq\" (UniqueName: \"kubernetes.io/projected/7f485ed7-a13a-4cee-b4ef-8df4e9659394-kube-api-access-9t8hq\") pod \"barbican-009d-account-create-cwtbh\" (UID: \"7f485ed7-a13a-4cee-b4ef-8df4e9659394\") " pod="openstack/barbican-009d-account-create-cwtbh" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.977459 4681 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:00 crc kubenswrapper[4681]: I1007 17:22:00.977474 4681 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10b22a35-0120-4b09-aa29-9e8a8fe6b802-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.004021 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t8hq\" (UniqueName: \"kubernetes.io/projected/7f485ed7-a13a-4cee-b4ef-8df4e9659394-kube-api-access-9t8hq\") pod \"barbican-009d-account-create-cwtbh\" (UID: \"7f485ed7-a13a-4cee-b4ef-8df4e9659394\") " pod="openstack/barbican-009d-account-create-cwtbh" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.067801 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7418-account-create-2nwdq"] Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.090192 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-6brxc" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.090326 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-dd57p" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.094240 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.094267 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" event={"ID":"e5f57f38-603f-48d1-9326-8b5183fe99ae","Type":"ContainerStarted","Data":"c7ad863ed46cafc8f2e967f164546d55da677e0599f1b018bc6905b29839408b"} Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.094282 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-dd57p" event={"ID":"10b22a35-0120-4b09-aa29-9e8a8fe6b802","Type":"ContainerDied","Data":"4638af6993e4944de4634ba15cb38affae38d5309ba813c9b232d1fb2d6eb2d1"} Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.094300 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76c6b58665-pvxbw"] Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.094320 4681 scope.go:117] "RemoveContainer" containerID="e5403e8b8c24593b2c473e5c6e15f24304169b53fd7795a2a3c16ecd3236eb9b" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.094484 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7418-account-create-2nwdq" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.095743 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.096263 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.107584 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tfjm\" (UniqueName: \"kubernetes.io/projected/45ce9368-1192-42cf-bd0d-e0f5a208ea77-kube-api-access-5tfjm\") pod \"cinder-7418-account-create-2nwdq\" (UID: \"45ce9368-1192-42cf-bd0d-e0f5a208ea77\") " pod="openstack/cinder-7418-account-create-2nwdq" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.125193 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7418-account-create-2nwdq"] Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.148023 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" podStartSLOduration=4.147999247 podStartE2EDuration="4.147999247s" podCreationTimestamp="2025-10-07 17:21:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:22:01.115152564 +0000 UTC m=+1124.762564119" watchObservedRunningTime="2025-10-07 17:22:01.147999247 +0000 UTC m=+1124.795410802" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.173745 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6bbd487785-qh8xz"] Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.186454 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-009d-account-create-cwtbh" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.199094 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6bbd487785-qh8xz" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.222140 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tfjm\" (UniqueName: \"kubernetes.io/projected/45ce9368-1192-42cf-bd0d-e0f5a208ea77-kube-api-access-5tfjm\") pod \"cinder-7418-account-create-2nwdq\" (UID: \"45ce9368-1192-42cf-bd0d-e0f5a208ea77\") " pod="openstack/cinder-7418-account-create-2nwdq" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.291373 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6bbd487785-qh8xz"] Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.293441 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tfjm\" (UniqueName: \"kubernetes.io/projected/45ce9368-1192-42cf-bd0d-e0f5a208ea77-kube-api-access-5tfjm\") pod \"cinder-7418-account-create-2nwdq\" (UID: \"45ce9368-1192-42cf-bd0d-e0f5a208ea77\") " pod="openstack/cinder-7418-account-create-2nwdq" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.318323 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-6brxc"] Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.324121 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-horizon-secret-key\") pod \"horizon-6bbd487785-qh8xz\" (UID: \"4eee8e4e-e688-44f8-aff9-44b5f57e1b68\") " pod="openstack/horizon-6bbd487785-qh8xz" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.324177 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-logs\") pod \"horizon-6bbd487785-qh8xz\" (UID: \"4eee8e4e-e688-44f8-aff9-44b5f57e1b68\") " pod="openstack/horizon-6bbd487785-qh8xz" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.324264 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-scripts\") pod \"horizon-6bbd487785-qh8xz\" (UID: \"4eee8e4e-e688-44f8-aff9-44b5f57e1b68\") " pod="openstack/horizon-6bbd487785-qh8xz" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.324282 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-config-data\") pod \"horizon-6bbd487785-qh8xz\" (UID: \"4eee8e4e-e688-44f8-aff9-44b5f57e1b68\") " pod="openstack/horizon-6bbd487785-qh8xz" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.324972 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzxrt\" (UniqueName: \"kubernetes.io/projected/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-kube-api-access-hzxrt\") pod \"horizon-6bbd487785-qh8xz\" (UID: \"4eee8e4e-e688-44f8-aff9-44b5f57e1b68\") " pod="openstack/horizon-6bbd487785-qh8xz" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.326160 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-6brxc"] Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.361798 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-dd57p"] Oct 07 17:22:01 crc kubenswrapper[4681]: 
I1007 17:22:01.374112 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-234c-account-create-qhmzc"] Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.375497 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-234c-account-create-qhmzc" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.377737 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.384075 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-dd57p"] Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.389335 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-234c-account-create-qhmzc"] Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.427544 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-scripts\") pod \"horizon-6bbd487785-qh8xz\" (UID: \"4eee8e4e-e688-44f8-aff9-44b5f57e1b68\") " pod="openstack/horizon-6bbd487785-qh8xz" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.427925 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-config-data\") pod \"horizon-6bbd487785-qh8xz\" (UID: \"4eee8e4e-e688-44f8-aff9-44b5f57e1b68\") " pod="openstack/horizon-6bbd487785-qh8xz" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.427986 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzxrt\" (UniqueName: \"kubernetes.io/projected/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-kube-api-access-hzxrt\") pod \"horizon-6bbd487785-qh8xz\" (UID: \"4eee8e4e-e688-44f8-aff9-44b5f57e1b68\") " pod="openstack/horizon-6bbd487785-qh8xz" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.428027 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-horizon-secret-key\") pod \"horizon-6bbd487785-qh8xz\" (UID: \"4eee8e4e-e688-44f8-aff9-44b5f57e1b68\") " pod="openstack/horizon-6bbd487785-qh8xz" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.428068 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-logs\") pod \"horizon-6bbd487785-qh8xz\" (UID: \"4eee8e4e-e688-44f8-aff9-44b5f57e1b68\") " pod="openstack/horizon-6bbd487785-qh8xz" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.428604 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-logs\") pod \"horizon-6bbd487785-qh8xz\" (UID: \"4eee8e4e-e688-44f8-aff9-44b5f57e1b68\") " pod="openstack/horizon-6bbd487785-qh8xz" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.428608 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-scripts\") pod \"horizon-6bbd487785-qh8xz\" (UID: \"4eee8e4e-e688-44f8-aff9-44b5f57e1b68\") " pod="openstack/horizon-6bbd487785-qh8xz" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.429981 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-config-data\") pod \"horizon-6bbd487785-qh8xz\" (UID: \"4eee8e4e-e688-44f8-aff9-44b5f57e1b68\") " pod="openstack/horizon-6bbd487785-qh8xz" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.438782 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-horizon-secret-key\") pod \"horizon-6bbd487785-qh8xz\" (UID: \"4eee8e4e-e688-44f8-aff9-44b5f57e1b68\") " pod="openstack/horizon-6bbd487785-qh8xz" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.439370 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7418-account-create-2nwdq" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.455442 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzxrt\" (UniqueName: \"kubernetes.io/projected/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-kube-api-access-hzxrt\") pod \"horizon-6bbd487785-qh8xz\" (UID: \"4eee8e4e-e688-44f8-aff9-44b5f57e1b68\") " pod="openstack/horizon-6bbd487785-qh8xz" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.530427 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt6rv\" (UniqueName: \"kubernetes.io/projected/05edf8ae-0f11-4fb5-8441-6400e4d49ec1-kube-api-access-nt6rv\") pod \"neutron-234c-account-create-qhmzc\" (UID: \"05edf8ae-0f11-4fb5-8441-6400e4d49ec1\") " pod="openstack/neutron-234c-account-create-qhmzc" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.584399 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6bbd487785-qh8xz" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.631890 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt6rv\" (UniqueName: \"kubernetes.io/projected/05edf8ae-0f11-4fb5-8441-6400e4d49ec1-kube-api-access-nt6rv\") pod \"neutron-234c-account-create-qhmzc\" (UID: \"05edf8ae-0f11-4fb5-8441-6400e4d49ec1\") " pod="openstack/neutron-234c-account-create-qhmzc" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.676054 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt6rv\" (UniqueName: \"kubernetes.io/projected/05edf8ae-0f11-4fb5-8441-6400e4d49ec1-kube-api-access-nt6rv\") pod \"neutron-234c-account-create-qhmzc\" (UID: \"05edf8ae-0f11-4fb5-8441-6400e4d49ec1\") " pod="openstack/neutron-234c-account-create-qhmzc" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.725555 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-234c-account-create-qhmzc" Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.872005 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-009d-account-create-cwtbh"] Oct 07 17:22:01 crc kubenswrapper[4681]: I1007 17:22:01.925637 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7418-account-create-2nwdq"] Oct 07 17:22:02 crc kubenswrapper[4681]: I1007 17:22:02.161750 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7418-account-create-2nwdq" event={"ID":"45ce9368-1192-42cf-bd0d-e0f5a208ea77","Type":"ContainerStarted","Data":"548978431b04e5231ab1fb672824e87a3d5165da498d916073cc816a39160e19"} Oct 07 17:22:02 crc kubenswrapper[4681]: I1007 17:22:02.176203 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-009d-account-create-cwtbh" event={"ID":"7f485ed7-a13a-4cee-b4ef-8df4e9659394","Type":"ContainerStarted","Data":"2eb3f9b038ca89652291d8aff7badc09debbc71da288c9d6284505e3da4f54de"} Oct 07 17:22:02 crc kubenswrapper[4681]: I1007 17:22:02.181636 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pgdhp" event={"ID":"2fd19541-0a38-4bab-bc65-ac2700770ce1","Type":"ContainerStarted","Data":"7584cfc8bbc92faa2c408828d41ef6e185457dda821cda29b7f3fc7981bdb801"} Oct 07 17:22:02 crc kubenswrapper[4681]: I1007 17:22:02.237137 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-pgdhp" podStartSLOduration=3.6485011910000003 podStartE2EDuration="33.23711646s" podCreationTimestamp="2025-10-07 17:21:29 +0000 UTC" firstStartedPulling="2025-10-07 17:21:30.318543391 +0000 UTC m=+1093.965954946" lastFinishedPulling="2025-10-07 17:21:59.90715866 +0000 UTC m=+1123.554570215" observedRunningTime="2025-10-07 17:22:02.23531515 +0000 UTC m=+1125.882726705" watchObservedRunningTime="2025-10-07 17:22:02.23711646 +0000 UTC m=+1125.884528015" Oct 07 17:22:02 crc kubenswrapper[4681]: I1007 17:22:02.237941 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-009d-account-create-cwtbh" podStartSLOduration=2.237936292 podStartE2EDuration="2.237936292s" podCreationTimestamp="2025-10-07 17:22:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:22:02.207456636 +0000 UTC m=+1125.854868191" watchObservedRunningTime="2025-10-07 17:22:02.237936292 +0000 UTC m=+1125.885347847" Oct 07 17:22:02 crc kubenswrapper[4681]: I1007 17:22:02.255332 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-234c-account-create-qhmzc"] Oct 07 17:22:02 crc kubenswrapper[4681]: I1007 17:22:02.322987 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6bbd487785-qh8xz"] Oct 07 17:22:03 crc kubenswrapper[4681]: I1007 17:22:03.043441 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10b22a35-0120-4b09-aa29-9e8a8fe6b802" path="/var/lib/kubelet/pods/10b22a35-0120-4b09-aa29-9e8a8fe6b802/volumes" Oct 07 17:22:03 crc kubenswrapper[4681]: I1007 17:22:03.044392 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9321d390-e73b-48db-9f87-7e9a5ba5e1fd" path="/var/lib/kubelet/pods/9321d390-e73b-48db-9f87-7e9a5ba5e1fd/volumes" Oct 07 17:22:03 crc kubenswrapper[4681]: I1007 17:22:03.209239 4681 generic.go:334] "Generic (PLEG): container finished" podID="45ce9368-1192-42cf-bd0d-e0f5a208ea77" 
containerID="522b4b57e4561b39de92c9d31fcf44554d94c5eac7b1ee302818010040107912" exitCode=0 Oct 07 17:22:03 crc kubenswrapper[4681]: I1007 17:22:03.209306 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7418-account-create-2nwdq" event={"ID":"45ce9368-1192-42cf-bd0d-e0f5a208ea77","Type":"ContainerDied","Data":"522b4b57e4561b39de92c9d31fcf44554d94c5eac7b1ee302818010040107912"} Oct 07 17:22:03 crc kubenswrapper[4681]: I1007 17:22:03.214998 4681 generic.go:334] "Generic (PLEG): container finished" podID="7f485ed7-a13a-4cee-b4ef-8df4e9659394" containerID="f9a3bc8cc397dec5ed9a2ce1f025c44880cc5cbaccbb83ab609e58cbe58b6282" exitCode=0 Oct 07 17:22:03 crc kubenswrapper[4681]: I1007 17:22:03.215068 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-009d-account-create-cwtbh" event={"ID":"7f485ed7-a13a-4cee-b4ef-8df4e9659394","Type":"ContainerDied","Data":"f9a3bc8cc397dec5ed9a2ce1f025c44880cc5cbaccbb83ab609e58cbe58b6282"} Oct 07 17:22:03 crc kubenswrapper[4681]: I1007 17:22:03.220631 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bbd487785-qh8xz" event={"ID":"4eee8e4e-e688-44f8-aff9-44b5f57e1b68","Type":"ContainerStarted","Data":"8af6da5c7db8e532e40eef92fc9a9e31de3a4bbd57d1b2e32ed7ff0d43e5aa2b"} Oct 07 17:22:03 crc kubenswrapper[4681]: I1007 17:22:03.226429 4681 generic.go:334] "Generic (PLEG): container finished" podID="05edf8ae-0f11-4fb5-8441-6400e4d49ec1" containerID="d1dba4d4bb0773cc6a80a0a008e4e7301ebf9bef5c0fd52a2ecadeb06f3d90e6" exitCode=0 Oct 07 17:22:03 crc kubenswrapper[4681]: I1007 17:22:03.226472 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-234c-account-create-qhmzc" event={"ID":"05edf8ae-0f11-4fb5-8441-6400e4d49ec1","Type":"ContainerDied","Data":"d1dba4d4bb0773cc6a80a0a008e4e7301ebf9bef5c0fd52a2ecadeb06f3d90e6"} Oct 07 17:22:03 crc kubenswrapper[4681]: I1007 17:22:03.226495 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-234c-account-create-qhmzc" event={"ID":"05edf8ae-0f11-4fb5-8441-6400e4d49ec1","Type":"ContainerStarted","Data":"43433f0cf55318fd29cdaa71b70a70e195b886a0096b83b9bcd2473377b18ec1"} Oct 07 17:22:06 crc kubenswrapper[4681]: I1007 17:22:06.267362 4681 generic.go:334] "Generic (PLEG): container finished" podID="e4a8c402-ed2a-431a-8bc6-1eafbf0f391b" containerID="589a400fada1fcb5d074a5adabfeb4644cd9cca72461253fb4b784214c38a6fd" exitCode=0 Oct 07 17:22:06 crc kubenswrapper[4681]: I1007 17:22:06.267785 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pf79f" event={"ID":"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b","Type":"ContainerDied","Data":"589a400fada1fcb5d074a5adabfeb4644cd9cca72461253fb4b784214c38a6fd"} Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.067259 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8459b45747-n55dk"] Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.113329 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-64677bd694-6xgb2"] Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.123791 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.130021 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.138537 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64677bd694-6xgb2"] Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.239996 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6bbd487785-qh8xz"] Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.292087 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-f945f854d-hm49c"] Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.293436 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.298332 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/990e1913-44d7-414b-a116-6b712547fc81-config-data\") pod \"horizon-64677bd694-6xgb2\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.298375 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/990e1913-44d7-414b-a116-6b712547fc81-scripts\") pod \"horizon-64677bd694-6xgb2\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.298399 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f84qg\" (UniqueName: \"kubernetes.io/projected/990e1913-44d7-414b-a116-6b712547fc81-kube-api-access-f84qg\") pod \"horizon-64677bd694-6xgb2\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.298421 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/990e1913-44d7-414b-a116-6b712547fc81-combined-ca-bundle\") pod \"horizon-64677bd694-6xgb2\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.298556 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/990e1913-44d7-414b-a116-6b712547fc81-horizon-tls-certs\") pod \"horizon-64677bd694-6xgb2\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.298593 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/990e1913-44d7-414b-a116-6b712547fc81-horizon-secret-key\") pod \"horizon-64677bd694-6xgb2\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.298726 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/990e1913-44d7-414b-a116-6b712547fc81-logs\") pod 
\"horizon-64677bd694-6xgb2\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.298739 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f945f854d-hm49c"] Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.400092 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/990e1913-44d7-414b-a116-6b712547fc81-horizon-secret-key\") pod \"horizon-64677bd694-6xgb2\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.400142 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/02a91326-9285-4589-a05b-c0a2c2ed397e-horizon-secret-key\") pod \"horizon-f945f854d-hm49c\" (UID: \"02a91326-9285-4589-a05b-c0a2c2ed397e\") " pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.400181 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/02a91326-9285-4589-a05b-c0a2c2ed397e-config-data\") pod \"horizon-f945f854d-hm49c\" (UID: \"02a91326-9285-4589-a05b-c0a2c2ed397e\") " pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.400199 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a91326-9285-4589-a05b-c0a2c2ed397e-combined-ca-bundle\") pod \"horizon-f945f854d-hm49c\" (UID: \"02a91326-9285-4589-a05b-c0a2c2ed397e\") " pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.400226 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/990e1913-44d7-414b-a116-6b712547fc81-logs\") pod \"horizon-64677bd694-6xgb2\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.400245 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2phm\" (UniqueName: \"kubernetes.io/projected/02a91326-9285-4589-a05b-c0a2c2ed397e-kube-api-access-w2phm\") pod \"horizon-f945f854d-hm49c\" (UID: \"02a91326-9285-4589-a05b-c0a2c2ed397e\") " pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.400291 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02a91326-9285-4589-a05b-c0a2c2ed397e-logs\") pod \"horizon-f945f854d-hm49c\" (UID: \"02a91326-9285-4589-a05b-c0a2c2ed397e\") " pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.400320 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/990e1913-44d7-414b-a116-6b712547fc81-config-data\") pod \"horizon-64677bd694-6xgb2\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.400337 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/990e1913-44d7-414b-a116-6b712547fc81-scripts\") pod \"horizon-64677bd694-6xgb2\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.400356 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f84qg\" (UniqueName: \"kubernetes.io/projected/990e1913-44d7-414b-a116-6b712547fc81-kube-api-access-f84qg\") pod \"horizon-64677bd694-6xgb2\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.400373 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/990e1913-44d7-414b-a116-6b712547fc81-combined-ca-bundle\") pod \"horizon-64677bd694-6xgb2\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.400411 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02a91326-9285-4589-a05b-c0a2c2ed397e-scripts\") pod \"horizon-f945f854d-hm49c\" (UID: \"02a91326-9285-4589-a05b-c0a2c2ed397e\") " pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.400435 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/02a91326-9285-4589-a05b-c0a2c2ed397e-horizon-tls-certs\") pod \"horizon-f945f854d-hm49c\" (UID: \"02a91326-9285-4589-a05b-c0a2c2ed397e\") " pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.400469 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/990e1913-44d7-414b-a116-6b712547fc81-horizon-tls-certs\") pod \"horizon-64677bd694-6xgb2\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.410246 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/990e1913-44d7-414b-a116-6b712547fc81-horizon-secret-key\") pod \"horizon-64677bd694-6xgb2\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.410575 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/990e1913-44d7-414b-a116-6b712547fc81-logs\") pod \"horizon-64677bd694-6xgb2\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.411349 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/990e1913-44d7-414b-a116-6b712547fc81-horizon-tls-certs\") pod \"horizon-64677bd694-6xgb2\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.411569 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/990e1913-44d7-414b-a116-6b712547fc81-config-data\") pod \"horizon-64677bd694-6xgb2\" (UID: 
\"990e1913-44d7-414b-a116-6b712547fc81\") " pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.412066 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/990e1913-44d7-414b-a116-6b712547fc81-scripts\") pod \"horizon-64677bd694-6xgb2\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.415517 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/990e1913-44d7-414b-a116-6b712547fc81-combined-ca-bundle\") pod \"horizon-64677bd694-6xgb2\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.433106 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f84qg\" (UniqueName: \"kubernetes.io/projected/990e1913-44d7-414b-a116-6b712547fc81-kube-api-access-f84qg\") pod \"horizon-64677bd694-6xgb2\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.440481 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.505440 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02a91326-9285-4589-a05b-c0a2c2ed397e-scripts\") pod \"horizon-f945f854d-hm49c\" (UID: \"02a91326-9285-4589-a05b-c0a2c2ed397e\") " pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.505771 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/02a91326-9285-4589-a05b-c0a2c2ed397e-horizon-tls-certs\") pod \"horizon-f945f854d-hm49c\" (UID: \"02a91326-9285-4589-a05b-c0a2c2ed397e\") " pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.505828 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/02a91326-9285-4589-a05b-c0a2c2ed397e-horizon-secret-key\") pod \"horizon-f945f854d-hm49c\" (UID: \"02a91326-9285-4589-a05b-c0a2c2ed397e\") " pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.505910 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/02a91326-9285-4589-a05b-c0a2c2ed397e-config-data\") pod \"horizon-f945f854d-hm49c\" (UID: \"02a91326-9285-4589-a05b-c0a2c2ed397e\") " pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.505932 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a91326-9285-4589-a05b-c0a2c2ed397e-combined-ca-bundle\") pod \"horizon-f945f854d-hm49c\" (UID: \"02a91326-9285-4589-a05b-c0a2c2ed397e\") " pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.505959 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2phm\" (UniqueName: \"kubernetes.io/projected/02a91326-9285-4589-a05b-c0a2c2ed397e-kube-api-access-w2phm\") pod 
\"horizon-f945f854d-hm49c\" (UID: \"02a91326-9285-4589-a05b-c0a2c2ed397e\") " pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.506000 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02a91326-9285-4589-a05b-c0a2c2ed397e-logs\") pod \"horizon-f945f854d-hm49c\" (UID: \"02a91326-9285-4589-a05b-c0a2c2ed397e\") " pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.506471 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02a91326-9285-4589-a05b-c0a2c2ed397e-logs\") pod \"horizon-f945f854d-hm49c\" (UID: \"02a91326-9285-4589-a05b-c0a2c2ed397e\") " pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.507050 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02a91326-9285-4589-a05b-c0a2c2ed397e-scripts\") pod \"horizon-f945f854d-hm49c\" (UID: \"02a91326-9285-4589-a05b-c0a2c2ed397e\") " pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.511364 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/02a91326-9285-4589-a05b-c0a2c2ed397e-config-data\") pod \"horizon-f945f854d-hm49c\" (UID: \"02a91326-9285-4589-a05b-c0a2c2ed397e\") " pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.511420 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/02a91326-9285-4589-a05b-c0a2c2ed397e-horizon-tls-certs\") pod \"horizon-f945f854d-hm49c\" (UID: \"02a91326-9285-4589-a05b-c0a2c2ed397e\") " pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.512832 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/02a91326-9285-4589-a05b-c0a2c2ed397e-horizon-secret-key\") pod \"horizon-f945f854d-hm49c\" (UID: \"02a91326-9285-4589-a05b-c0a2c2ed397e\") " pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.521581 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a91326-9285-4589-a05b-c0a2c2ed397e-combined-ca-bundle\") pod \"horizon-f945f854d-hm49c\" (UID: \"02a91326-9285-4589-a05b-c0a2c2ed397e\") " pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.532820 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2phm\" (UniqueName: \"kubernetes.io/projected/02a91326-9285-4589-a05b-c0a2c2ed397e-kube-api-access-w2phm\") pod \"horizon-f945f854d-hm49c\" (UID: \"02a91326-9285-4589-a05b-c0a2c2ed397e\") " pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:22:07 crc kubenswrapper[4681]: I1007 17:22:07.617003 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-f945f854d-hm49c"
Oct 07 17:22:08 crc kubenswrapper[4681]: I1007 17:22:08.200062 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cf78879c9-bhfl5"
Oct 07 17:22:08 crc kubenswrapper[4681]: I1007 17:22:08.259897 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-mjmjw"]
Oct 07 17:22:08 crc kubenswrapper[4681]: I1007 17:22:08.260381 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw" podUID="edde6df1-fefc-48ee-b81a-638a564a6e18" containerName="dnsmasq-dns" containerID="cri-o://875624b1bccd2198c9b96d0c8953e0d10435738f346d156531023242e1ad19c0" gracePeriod=10
Oct 07 17:22:09 crc kubenswrapper[4681]: I1007 17:22:09.304761 4681 generic.go:334] "Generic (PLEG): container finished" podID="edde6df1-fefc-48ee-b81a-638a564a6e18" containerID="875624b1bccd2198c9b96d0c8953e0d10435738f346d156531023242e1ad19c0" exitCode=0
Oct 07 17:22:09 crc kubenswrapper[4681]: I1007 17:22:09.304805 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw" event={"ID":"edde6df1-fefc-48ee-b81a-638a564a6e18","Type":"ContainerDied","Data":"875624b1bccd2198c9b96d0c8953e0d10435738f346d156531023242e1ad19c0"}
Oct 07 17:22:10 crc kubenswrapper[4681]: I1007 17:22:10.313722 4681 generic.go:334] "Generic (PLEG): container finished" podID="2fd19541-0a38-4bab-bc65-ac2700770ce1" containerID="7584cfc8bbc92faa2c408828d41ef6e185457dda821cda29b7f3fc7981bdb801" exitCode=0
Oct 07 17:22:10 crc kubenswrapper[4681]: I1007 17:22:10.314000 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pgdhp" event={"ID":"2fd19541-0a38-4bab-bc65-ac2700770ce1","Type":"ContainerDied","Data":"7584cfc8bbc92faa2c408828d41ef6e185457dda821cda29b7f3fc7981bdb801"}
Oct 07 17:22:11 crc kubenswrapper[4681]: I1007 17:22:11.526822 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw" podUID="edde6df1-fefc-48ee-b81a-638a564a6e18" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.118:5353: connect: connection refused"
Oct 07 17:22:13 crc kubenswrapper[4681]: I1007 17:22:13.000912 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7418-account-create-2nwdq"
Oct 07 17:22:13 crc kubenswrapper[4681]: I1007 17:22:13.004185 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-234c-account-create-qhmzc"
Oct 07 17:22:13 crc kubenswrapper[4681]: I1007 17:22:13.023431 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tfjm\" (UniqueName: \"kubernetes.io/projected/45ce9368-1192-42cf-bd0d-e0f5a208ea77-kube-api-access-5tfjm\") pod \"45ce9368-1192-42cf-bd0d-e0f5a208ea77\" (UID: \"45ce9368-1192-42cf-bd0d-e0f5a208ea77\") "
Oct 07 17:22:13 crc kubenswrapper[4681]: I1007 17:22:13.023554 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt6rv\" (UniqueName: \"kubernetes.io/projected/05edf8ae-0f11-4fb5-8441-6400e4d49ec1-kube-api-access-nt6rv\") pod \"05edf8ae-0f11-4fb5-8441-6400e4d49ec1\" (UID: \"05edf8ae-0f11-4fb5-8441-6400e4d49ec1\") "
Oct 07 17:22:13 crc kubenswrapper[4681]: I1007 17:22:13.050574 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05edf8ae-0f11-4fb5-8441-6400e4d49ec1-kube-api-access-nt6rv" (OuterVolumeSpecName: "kube-api-access-nt6rv") pod "05edf8ae-0f11-4fb5-8441-6400e4d49ec1" (UID: "05edf8ae-0f11-4fb5-8441-6400e4d49ec1"). InnerVolumeSpecName "kube-api-access-nt6rv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 17:22:13 crc kubenswrapper[4681]: I1007 17:22:13.050966 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45ce9368-1192-42cf-bd0d-e0f5a208ea77-kube-api-access-5tfjm" (OuterVolumeSpecName: "kube-api-access-5tfjm") pod "45ce9368-1192-42cf-bd0d-e0f5a208ea77" (UID: "45ce9368-1192-42cf-bd0d-e0f5a208ea77"). InnerVolumeSpecName "kube-api-access-5tfjm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 17:22:13 crc kubenswrapper[4681]: I1007 17:22:13.128188 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tfjm\" (UniqueName: \"kubernetes.io/projected/45ce9368-1192-42cf-bd0d-e0f5a208ea77-kube-api-access-5tfjm\") on node \"crc\" DevicePath \"\""
Oct 07 17:22:13 crc kubenswrapper[4681]: I1007 17:22:13.128250 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt6rv\" (UniqueName: \"kubernetes.io/projected/05edf8ae-0f11-4fb5-8441-6400e4d49ec1-kube-api-access-nt6rv\") on node \"crc\" DevicePath \"\""
Oct 07 17:22:13 crc kubenswrapper[4681]: I1007 17:22:13.349847 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7418-account-create-2nwdq" event={"ID":"45ce9368-1192-42cf-bd0d-e0f5a208ea77","Type":"ContainerDied","Data":"548978431b04e5231ab1fb672824e87a3d5165da498d916073cc816a39160e19"}
Oct 07 17:22:13 crc kubenswrapper[4681]: I1007 17:22:13.350223 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="548978431b04e5231ab1fb672824e87a3d5165da498d916073cc816a39160e19"
Oct 07 17:22:13 crc kubenswrapper[4681]: I1007 17:22:13.350300 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7418-account-create-2nwdq"
Oct 07 17:22:13 crc kubenswrapper[4681]: I1007 17:22:13.352774 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-234c-account-create-qhmzc" event={"ID":"05edf8ae-0f11-4fb5-8441-6400e4d49ec1","Type":"ContainerDied","Data":"43433f0cf55318fd29cdaa71b70a70e195b886a0096b83b9bcd2473377b18ec1"}
Oct 07 17:22:13 crc kubenswrapper[4681]: I1007 17:22:13.352843 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43433f0cf55318fd29cdaa71b70a70e195b886a0096b83b9bcd2473377b18ec1"
Oct 07 17:22:13 crc kubenswrapper[4681]: I1007 17:22:13.352987 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-234c-account-create-qhmzc"
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.021600 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pf79f"
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.023534 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-pgdhp"
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.038323 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-009d-account-create-cwtbh"
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.094488 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw"
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.184275 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-combined-ca-bundle\") pod \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\" (UID: \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\") "
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.184338 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-credential-keys\") pod \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\" (UID: \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\") "
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.184356 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-config-data\") pod \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\" (UID: \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\") "
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.184375 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-fernet-keys\") pod \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\" (UID: \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\") "
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.184401 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd19541-0a38-4bab-bc65-ac2700770ce1-combined-ca-bundle\") pod \"2fd19541-0a38-4bab-bc65-ac2700770ce1\" (UID: \"2fd19541-0a38-4bab-bc65-ac2700770ce1\") "
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.184424 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-scripts\") pod \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\" (UID: \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\") "
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.184454 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t8hq\" (UniqueName: \"kubernetes.io/projected/7f485ed7-a13a-4cee-b4ef-8df4e9659394-kube-api-access-9t8hq\") pod \"7f485ed7-a13a-4cee-b4ef-8df4e9659394\" (UID: \"7f485ed7-a13a-4cee-b4ef-8df4e9659394\") "
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.184500 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fd19541-0a38-4bab-bc65-ac2700770ce1-config-data\") pod \"2fd19541-0a38-4bab-bc65-ac2700770ce1\" (UID: \"2fd19541-0a38-4bab-bc65-ac2700770ce1\") "
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.184608 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-228bv\" (UniqueName: \"kubernetes.io/projected/2fd19541-0a38-4bab-bc65-ac2700770ce1-kube-api-access-228bv\") pod \"2fd19541-0a38-4bab-bc65-ac2700770ce1\" (UID: \"2fd19541-0a38-4bab-bc65-ac2700770ce1\") "
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.184655 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2fd19541-0a38-4bab-bc65-ac2700770ce1-db-sync-config-data\") pod \"2fd19541-0a38-4bab-bc65-ac2700770ce1\" (UID: \"2fd19541-0a38-4bab-bc65-ac2700770ce1\") "
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.184737 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khknk\" (UniqueName: \"kubernetes.io/projected/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-kube-api-access-khknk\") pod \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\" (UID: \"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b\") "
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.204737 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-scripts" (OuterVolumeSpecName: "scripts") pod "e4a8c402-ed2a-431a-8bc6-1eafbf0f391b" (UID: "e4a8c402-ed2a-431a-8bc6-1eafbf0f391b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.210577 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e4a8c402-ed2a-431a-8bc6-1eafbf0f391b" (UID: "e4a8c402-ed2a-431a-8bc6-1eafbf0f391b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.211071 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f485ed7-a13a-4cee-b4ef-8df4e9659394-kube-api-access-9t8hq" (OuterVolumeSpecName: "kube-api-access-9t8hq") pod "7f485ed7-a13a-4cee-b4ef-8df4e9659394" (UID: "7f485ed7-a13a-4cee-b4ef-8df4e9659394"). InnerVolumeSpecName "kube-api-access-9t8hq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.213262 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd19541-0a38-4bab-bc65-ac2700770ce1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2fd19541-0a38-4bab-bc65-ac2700770ce1" (UID: "2fd19541-0a38-4bab-bc65-ac2700770ce1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.213316 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-kube-api-access-khknk" (OuterVolumeSpecName: "kube-api-access-khknk") pod "e4a8c402-ed2a-431a-8bc6-1eafbf0f391b" (UID: "e4a8c402-ed2a-431a-8bc6-1eafbf0f391b"). InnerVolumeSpecName "kube-api-access-khknk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.254189 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fd19541-0a38-4bab-bc65-ac2700770ce1-kube-api-access-228bv" (OuterVolumeSpecName: "kube-api-access-228bv") pod "2fd19541-0a38-4bab-bc65-ac2700770ce1" (UID: "2fd19541-0a38-4bab-bc65-ac2700770ce1"). InnerVolumeSpecName "kube-api-access-228bv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.254289 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e4a8c402-ed2a-431a-8bc6-1eafbf0f391b" (UID: "e4a8c402-ed2a-431a-8bc6-1eafbf0f391b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.262935 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f945f854d-hm49c"]
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.293190 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/edde6df1-fefc-48ee-b81a-638a564a6e18-ovsdbserver-sb\") pod \"edde6df1-fefc-48ee-b81a-638a564a6e18\" (UID: \"edde6df1-fefc-48ee-b81a-638a564a6e18\") "
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.293257 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edde6df1-fefc-48ee-b81a-638a564a6e18-config\") pod \"edde6df1-fefc-48ee-b81a-638a564a6e18\" (UID: \"edde6df1-fefc-48ee-b81a-638a564a6e18\") "
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.293281 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/edde6df1-fefc-48ee-b81a-638a564a6e18-dns-svc\") pod \"edde6df1-fefc-48ee-b81a-638a564a6e18\" (UID: \"edde6df1-fefc-48ee-b81a-638a564a6e18\") "
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.293515 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/edde6df1-fefc-48ee-b81a-638a564a6e18-ovsdbserver-nb\") pod \"edde6df1-fefc-48ee-b81a-638a564a6e18\" (UID: \"edde6df1-fefc-48ee-b81a-638a564a6e18\") "
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.293561 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2dtq\" (UniqueName: \"kubernetes.io/projected/edde6df1-fefc-48ee-b81a-638a564a6e18-kube-api-access-p2dtq\") pod \"edde6df1-fefc-48ee-b81a-638a564a6e18\" (UID: \"edde6df1-fefc-48ee-b81a-638a564a6e18\") "
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.293901 4681 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2fd19541-0a38-4bab-bc65-ac2700770ce1-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.293912 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khknk\" (UniqueName: \"kubernetes.io/projected/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-kube-api-access-khknk\") on node \"crc\" DevicePath \"\""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.293923 4681 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-credential-keys\") on node \"crc\" DevicePath \"\""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.293932 4681 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-fernet-keys\") on node \"crc\" DevicePath \"\""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.293941 4681 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-scripts\") on node \"crc\" DevicePath \"\""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.293950 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t8hq\" (UniqueName: \"kubernetes.io/projected/7f485ed7-a13a-4cee-b4ef-8df4e9659394-kube-api-access-9t8hq\") on node \"crc\" DevicePath \"\""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.293958 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-228bv\" (UniqueName: \"kubernetes.io/projected/2fd19541-0a38-4bab-bc65-ac2700770ce1-kube-api-access-228bv\") on node \"crc\" DevicePath \"\""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.303935 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edde6df1-fefc-48ee-b81a-638a564a6e18-kube-api-access-p2dtq" (OuterVolumeSpecName: "kube-api-access-p2dtq") pod "edde6df1-fefc-48ee-b81a-638a564a6e18" (UID: "edde6df1-fefc-48ee-b81a-638a564a6e18"). InnerVolumeSpecName "kube-api-access-p2dtq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.304148 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-config-data" (OuterVolumeSpecName: "config-data") pod "e4a8c402-ed2a-431a-8bc6-1eafbf0f391b" (UID: "e4a8c402-ed2a-431a-8bc6-1eafbf0f391b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.365169 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4a8c402-ed2a-431a-8bc6-1eafbf0f391b" (UID: "e4a8c402-ed2a-431a-8bc6-1eafbf0f391b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.389839 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-009d-account-create-cwtbh" event={"ID":"7f485ed7-a13a-4cee-b4ef-8df4e9659394","Type":"ContainerDied","Data":"2eb3f9b038ca89652291d8aff7badc09debbc71da288c9d6284505e3da4f54de"}
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.389922 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2eb3f9b038ca89652291d8aff7badc09debbc71da288c9d6284505e3da4f54de"
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.390014 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-009d-account-create-cwtbh"
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.397057 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.397095 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b-config-data\") on node \"crc\" DevicePath \"\""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.397110 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2dtq\" (UniqueName: \"kubernetes.io/projected/edde6df1-fefc-48ee-b81a-638a564a6e18-kube-api-access-p2dtq\") on node \"crc\" DevicePath \"\""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.409241 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-pgdhp"
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.409343 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pgdhp" event={"ID":"2fd19541-0a38-4bab-bc65-ac2700770ce1","Type":"ContainerDied","Data":"d6c0a8126b87ce7cd20e510e101d9c0238379008553cb1b814cf598df18df3e5"}
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.410359 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6c0a8126b87ce7cd20e510e101d9c0238379008553cb1b814cf598df18df3e5"
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.417397 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pf79f"
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.417693 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pf79f" event={"ID":"e4a8c402-ed2a-431a-8bc6-1eafbf0f391b","Type":"ContainerDied","Data":"40e3ce9772d22d6f847d14707fb3c470004d5221dd4070290da920e1d433129a"}
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.417735 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40e3ce9772d22d6f847d14707fb3c470004d5221dd4070290da920e1d433129a"
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.423468 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd19541-0a38-4bab-bc65-ac2700770ce1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2fd19541-0a38-4bab-bc65-ac2700770ce1" (UID: "2fd19541-0a38-4bab-bc65-ac2700770ce1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.424214 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw" event={"ID":"edde6df1-fefc-48ee-b81a-638a564a6e18","Type":"ContainerDied","Data":"d7f8630e38d462f229763a7f7c3e4d885fe9547b7ce88f98e57b00f0623b58a1"}
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.424447 4681 scope.go:117] "RemoveContainer" containerID="875624b1bccd2198c9b96d0c8953e0d10435738f346d156531023242e1ad19c0"
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.424408 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-mjmjw"
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.430999 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f945f854d-hm49c" event={"ID":"02a91326-9285-4589-a05b-c0a2c2ed397e","Type":"ContainerStarted","Data":"709d87e2be10485849289c240677cf4f92ecd9d56b4dace501947f8c1d37d9f2"}
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.468347 4681 scope.go:117] "RemoveContainer" containerID="9772760a33fb2cf57ae6cb84e66adf5abc975b17eca4478318dc5ffca81a0c0c"
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.493234 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64677bd694-6xgb2"]
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.499342 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd19541-0a38-4bab-bc65-ac2700770ce1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.628820 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd19541-0a38-4bab-bc65-ac2700770ce1-config-data" (OuterVolumeSpecName: "config-data") pod "2fd19541-0a38-4bab-bc65-ac2700770ce1" (UID: "2fd19541-0a38-4bab-bc65-ac2700770ce1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.679033 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edde6df1-fefc-48ee-b81a-638a564a6e18-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "edde6df1-fefc-48ee-b81a-638a564a6e18" (UID: "edde6df1-fefc-48ee-b81a-638a564a6e18"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.686177 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edde6df1-fefc-48ee-b81a-638a564a6e18-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "edde6df1-fefc-48ee-b81a-638a564a6e18" (UID: "edde6df1-fefc-48ee-b81a-638a564a6e18"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.703988 4681 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/edde6df1-fefc-48ee-b81a-638a564a6e18-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.704126 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fd19541-0a38-4bab-bc65-ac2700770ce1-config-data\") on node \"crc\" DevicePath \"\""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.704199 4681 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/edde6df1-fefc-48ee-b81a-638a564a6e18-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.704780 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edde6df1-fefc-48ee-b81a-638a564a6e18-config" (OuterVolumeSpecName: "config") pod "edde6df1-fefc-48ee-b81a-638a564a6e18" (UID: "edde6df1-fefc-48ee-b81a-638a564a6e18"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.746404 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edde6df1-fefc-48ee-b81a-638a564a6e18-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "edde6df1-fefc-48ee-b81a-638a564a6e18" (UID: "edde6df1-fefc-48ee-b81a-638a564a6e18"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.807220 4681 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/edde6df1-fefc-48ee-b81a-638a564a6e18-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 07 17:22:15 crc kubenswrapper[4681]: I1007 17:22:15.807262 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edde6df1-fefc-48ee-b81a-638a564a6e18-config\") on node \"crc\" DevicePath \"\""
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.063399 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-mjmjw"]
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.087104 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-mjmjw"]
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.267208 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-pf79f"]
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.293427 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-pf79f"]
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.339947 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-k2qcd"]
Oct 07 17:22:16 crc kubenswrapper[4681]: E1007 17:22:16.340349 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a8c402-ed2a-431a-8bc6-1eafbf0f391b" containerName="keystone-bootstrap"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.340361 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a8c402-ed2a-431a-8bc6-1eafbf0f391b" containerName="keystone-bootstrap"
Oct 07 17:22:16 crc kubenswrapper[4681]: E1007 17:22:16.340374 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45ce9368-1192-42cf-bd0d-e0f5a208ea77" containerName="mariadb-account-create"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.340380 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="45ce9368-1192-42cf-bd0d-e0f5a208ea77" containerName="mariadb-account-create"
Oct 07 17:22:16 crc kubenswrapper[4681]: E1007 17:22:16.340391 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edde6df1-fefc-48ee-b81a-638a564a6e18" containerName="dnsmasq-dns"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.340397 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="edde6df1-fefc-48ee-b81a-638a564a6e18" containerName="dnsmasq-dns"
Oct 07 17:22:16 crc kubenswrapper[4681]: E1007 17:22:16.340412 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f485ed7-a13a-4cee-b4ef-8df4e9659394" containerName="mariadb-account-create"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.340419 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f485ed7-a13a-4cee-b4ef-8df4e9659394" containerName="mariadb-account-create"
Oct 07 17:22:16 crc kubenswrapper[4681]: E1007 17:22:16.340438 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05edf8ae-0f11-4fb5-8441-6400e4d49ec1" containerName="mariadb-account-create"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.340445 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="05edf8ae-0f11-4fb5-8441-6400e4d49ec1" containerName="mariadb-account-create"
Oct 07 17:22:16 crc kubenswrapper[4681]: E1007 17:22:16.340459 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fd19541-0a38-4bab-bc65-ac2700770ce1" containerName="glance-db-sync"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.340467 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fd19541-0a38-4bab-bc65-ac2700770ce1" containerName="glance-db-sync"
Oct 07 17:22:16 crc kubenswrapper[4681]: E1007 17:22:16.340481 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edde6df1-fefc-48ee-b81a-638a564a6e18" containerName="init"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.340488 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="edde6df1-fefc-48ee-b81a-638a564a6e18" containerName="init"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.340658 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fd19541-0a38-4bab-bc65-ac2700770ce1" containerName="glance-db-sync"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.340672 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a8c402-ed2a-431a-8bc6-1eafbf0f391b" containerName="keystone-bootstrap"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.340685 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="edde6df1-fefc-48ee-b81a-638a564a6e18" containerName="dnsmasq-dns"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.340693 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f485ed7-a13a-4cee-b4ef-8df4e9659394" containerName="mariadb-account-create"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.340707 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="05edf8ae-0f11-4fb5-8441-6400e4d49ec1" containerName="mariadb-account-create"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.340719 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="45ce9368-1192-42cf-bd0d-e0f5a208ea77" containerName="mariadb-account-create"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.341323 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k2qcd"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.348509 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.348809 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4bnt7"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.348942 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.349437 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.362193 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-k2qcd"]
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.421578 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-config-data\") pod \"keystone-bootstrap-k2qcd\" (UID: \"9405f877-b9a6-4d64-92f1-df500e73046f\") " pod="openstack/keystone-bootstrap-k2qcd"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.421637 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-credential-keys\") pod \"keystone-bootstrap-k2qcd\" (UID: \"9405f877-b9a6-4d64-92f1-df500e73046f\") " pod="openstack/keystone-bootstrap-k2qcd"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.421670 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-combined-ca-bundle\") pod \"keystone-bootstrap-k2qcd\" (UID: \"9405f877-b9a6-4d64-92f1-df500e73046f\") " pod="openstack/keystone-bootstrap-k2qcd"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.421701 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-fernet-keys\") pod \"keystone-bootstrap-k2qcd\" (UID: \"9405f877-b9a6-4d64-92f1-df500e73046f\") " pod="openstack/keystone-bootstrap-k2qcd"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.421728 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxl5d\" (UniqueName: \"kubernetes.io/projected/9405f877-b9a6-4d64-92f1-df500e73046f-kube-api-access-bxl5d\") pod \"keystone-bootstrap-k2qcd\" (UID: \"9405f877-b9a6-4d64-92f1-df500e73046f\") " pod="openstack/keystone-bootstrap-k2qcd"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.421755 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-scripts\") pod \"keystone-bootstrap-k2qcd\" (UID: \"9405f877-b9a6-4d64-92f1-df500e73046f\") " pod="openstack/keystone-bootstrap-k2qcd"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.462375 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64677bd694-6xgb2" event={"ID":"990e1913-44d7-414b-a116-6b712547fc81","Type":"ContainerStarted","Data":"f12687af7e3841ca2a53c32deb4a7158e2c0c873f8ce45fbe4d823c0abd5a391"}
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.462733 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64677bd694-6xgb2" event={"ID":"990e1913-44d7-414b-a116-6b712547fc81","Type":"ContainerStarted","Data":"a4a63533711ff63ac127f62f10923e4573a91a5f48356a8d8ac2c8ae6c22c5bf"}
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.469217 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76c6b58665-pvxbw" event={"ID":"4c67fd19-2e0a-4afa-a595-fce5c29c3f18","Type":"ContainerStarted","Data":"8b340fbf22bcedbe7dacd4b8cd868085dcbfc8c8c0da3a84cd19f8b5bbfdf7e9"}
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.469267 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76c6b58665-pvxbw" event={"ID":"4c67fd19-2e0a-4afa-a595-fce5c29c3f18","Type":"ContainerStarted","Data":"13b916f178d0bb19a6190b7678f715a9f137a971c01560d907c210fcf71dd502"}
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.469410 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-76c6b58665-pvxbw" podUID="4c67fd19-2e0a-4afa-a595-fce5c29c3f18" containerName="horizon-log" containerID="cri-o://13b916f178d0bb19a6190b7678f715a9f137a971c01560d907c210fcf71dd502" gracePeriod=30
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.469975 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-76c6b58665-pvxbw" podUID="4c67fd19-2e0a-4afa-a595-fce5c29c3f18" containerName="horizon" containerID="cri-o://8b340fbf22bcedbe7dacd4b8cd868085dcbfc8c8c0da3a84cd19f8b5bbfdf7e9" gracePeriod=30
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.483933 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8459b45747-n55dk" event={"ID":"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e","Type":"ContainerStarted","Data":"a03ca3b288d90e815ad9905ddfda88c242050dd2a3f9a5467015708963131c98"}
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.483981 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8459b45747-n55dk" event={"ID":"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e","Type":"ContainerStarted","Data":"71d5ff623f8b05189ca6d0b08c6b99727edaffa2405dbc5b4965eb029cb64783"}
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.484129 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8459b45747-n55dk" podUID="e14c0f1b-a6a9-4e80-a975-890aac3dcd0e" containerName="horizon-log" containerID="cri-o://a03ca3b288d90e815ad9905ddfda88c242050dd2a3f9a5467015708963131c98" gracePeriod=30
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.484457 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8459b45747-n55dk" podUID="e14c0f1b-a6a9-4e80-a975-890aac3dcd0e" containerName="horizon" containerID="cri-o://71d5ff623f8b05189ca6d0b08c6b99727edaffa2405dbc5b4965eb029cb64783" gracePeriod=30
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.491811 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bbd487785-qh8xz" event={"ID":"4eee8e4e-e688-44f8-aff9-44b5f57e1b68","Type":"ContainerStarted","Data":"e72a348d333ad2ee6f8d1e2f2fa1710ced71845a803b980aca00d8c13999706d"}
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.491898 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bbd487785-qh8xz" event={"ID":"4eee8e4e-e688-44f8-aff9-44b5f57e1b68","Type":"ContainerStarted","Data":"0eec6b6b2c006c550724278ee4876af6bcb6e713b749e75e6810a21c4a5b1913"}
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.492067 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6bbd487785-qh8xz" podUID="4eee8e4e-e688-44f8-aff9-44b5f57e1b68" containerName="horizon-log" containerID="cri-o://e72a348d333ad2ee6f8d1e2f2fa1710ced71845a803b980aca00d8c13999706d" gracePeriod=30
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.492182 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6bbd487785-qh8xz" podUID="4eee8e4e-e688-44f8-aff9-44b5f57e1b68" containerName="horizon" containerID="cri-o://0eec6b6b2c006c550724278ee4876af6bcb6e713b749e75e6810a21c4a5b1913" gracePeriod=30
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.507824 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f945f854d-hm49c" event={"ID":"02a91326-9285-4589-a05b-c0a2c2ed397e","Type":"ContainerStarted","Data":"9084625f4c93f3307d3d2fa500d4105766d6a26c88fba8323a56f7e6882db8ed"}
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.507868 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f945f854d-hm49c" event={"ID":"02a91326-9285-4589-a05b-c0a2c2ed397e","Type":"ContainerStarted","Data":"535179f5a0b592940727592cac0bafaeefbfa03b138dbee0420b63178219c6e6"}
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.521867 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-m2s62" event={"ID":"fdb73657-7045-4536-b856-81fcc6da6718","Type":"ContainerStarted","Data":"5580871f26d9f7dcc73fee042cbdd5c2de607e9b709a34fd42a3aa1cdc023ad1"}
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.523434 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-combined-ca-bundle\") pod \"keystone-bootstrap-k2qcd\" (UID: \"9405f877-b9a6-4d64-92f1-df500e73046f\") " pod="openstack/keystone-bootstrap-k2qcd"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.523511 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-fernet-keys\") pod \"keystone-bootstrap-k2qcd\" (UID: \"9405f877-b9a6-4d64-92f1-df500e73046f\") " pod="openstack/keystone-bootstrap-k2qcd"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.523631 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxl5d\" (UniqueName: \"kubernetes.io/projected/9405f877-b9a6-4d64-92f1-df500e73046f-kube-api-access-bxl5d\") pod \"keystone-bootstrap-k2qcd\" (UID: \"9405f877-b9a6-4d64-92f1-df500e73046f\") " pod="openstack/keystone-bootstrap-k2qcd"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.523670 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-scripts\") pod \"keystone-bootstrap-k2qcd\" (UID: \"9405f877-b9a6-4d64-92f1-df500e73046f\") " pod="openstack/keystone-bootstrap-k2qcd"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.524092 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-config-data\") pod \"keystone-bootstrap-k2qcd\" (UID: \"9405f877-b9a6-4d64-92f1-df500e73046f\") " pod="openstack/keystone-bootstrap-k2qcd"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.524164 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-credential-keys\") pod \"keystone-bootstrap-k2qcd\" (UID: \"9405f877-b9a6-4d64-92f1-df500e73046f\") " pod="openstack/keystone-bootstrap-k2qcd"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.537298 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-credential-keys\") pod \"keystone-bootstrap-k2qcd\" (UID: \"9405f877-b9a6-4d64-92f1-df500e73046f\") " pod="openstack/keystone-bootstrap-k2qcd"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.541594 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-combined-ca-bundle\") pod \"keystone-bootstrap-k2qcd\" (UID: \"9405f877-b9a6-4d64-92f1-df500e73046f\") " pod="openstack/keystone-bootstrap-k2qcd"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.542424 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4673f09e-2140-4dc5-ac9d-af616ddba08d","Type":"ContainerStarted","Data":"47933adc22c72093df0bb815109463ebebb9ddee791709ddfd8670e815a64f85"}
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.545010 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-fernet-keys\") pod \"keystone-bootstrap-k2qcd\" (UID: \"9405f877-b9a6-4d64-92f1-df500e73046f\") " pod="openstack/keystone-bootstrap-k2qcd"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.547569 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-scripts\") pod \"keystone-bootstrap-k2qcd\" (UID: \"9405f877-b9a6-4d64-92f1-df500e73046f\") " pod="openstack/keystone-bootstrap-k2qcd"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.555595 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-config-data\") pod \"keystone-bootstrap-k2qcd\" (UID: \"9405f877-b9a6-4d64-92f1-df500e73046f\") " pod="openstack/keystone-bootstrap-k2qcd"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.561624 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-x9lv2"]
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.562663 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-x9lv2"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.568424 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.568649 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.568761 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-tnpq2"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.569942 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-76c6b58665-pvxbw" podStartSLOduration=3.635774542 podStartE2EDuration="19.569922118s" podCreationTimestamp="2025-10-07 17:21:57 +0000 UTC" firstStartedPulling="2025-10-07 17:21:58.995925987 +0000 UTC m=+1122.643337542" lastFinishedPulling="2025-10-07 17:22:14.930073563 +0000 UTC m=+1138.577485118" observedRunningTime="2025-10-07 17:22:16.528639722 +0000 UTC m=+1140.176051267" watchObservedRunningTime="2025-10-07 17:22:16.569922118 +0000 UTC m=+1140.217333663"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.602636 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxl5d\" (UniqueName: \"kubernetes.io/projected/9405f877-b9a6-4d64-92f1-df500e73046f-kube-api-access-bxl5d\") pod \"keystone-bootstrap-k2qcd\" (UID: \"9405f877-b9a6-4d64-92f1-df500e73046f\") " pod="openstack/keystone-bootstrap-k2qcd"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.607069 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-8459b45747-n55dk" podStartSLOduration=3.337319905 podStartE2EDuration="19.607049579s" podCreationTimestamp="2025-10-07 17:21:57 +0000 UTC" firstStartedPulling="2025-10-07 17:21:58.638465871 +0000 UTC m=+1122.285877426" lastFinishedPulling="2025-10-07 17:22:14.908195545 +0000 UTC m=+1138.555607100" observedRunningTime="2025-10-07 17:22:16.595777626 +0000 UTC m=+1140.243189201" watchObservedRunningTime="2025-10-07 17:22:16.607049579 +0000 UTC m=+1140.254461134"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.613385 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-x9lv2"]
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.627804 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a53e8384-cd97-4cec-ae70-918f86112a99-db-sync-config-data\") pod \"cinder-db-sync-x9lv2\" (UID: \"a53e8384-cd97-4cec-ae70-918f86112a99\") " pod="openstack/cinder-db-sync-x9lv2"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.627864 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a53e8384-cd97-4cec-ae70-918f86112a99-combined-ca-bundle\") pod \"cinder-db-sync-x9lv2\" (UID: \"a53e8384-cd97-4cec-ae70-918f86112a99\") " pod="openstack/cinder-db-sync-x9lv2"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.628042 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a53e8384-cd97-4cec-ae70-918f86112a99-config-data\") pod \"cinder-db-sync-x9lv2\" (UID: \"a53e8384-cd97-4cec-ae70-918f86112a99\") " pod="openstack/cinder-db-sync-x9lv2"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.628141 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qftrr\" (UniqueName: \"kubernetes.io/projected/a53e8384-cd97-4cec-ae70-918f86112a99-kube-api-access-qftrr\") pod \"cinder-db-sync-x9lv2\" (UID: \"a53e8384-cd97-4cec-ae70-918f86112a99\") " pod="openstack/cinder-db-sync-x9lv2"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.628278 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a53e8384-cd97-4cec-ae70-918f86112a99-etc-machine-id\") pod \"cinder-db-sync-x9lv2\" (UID: \"a53e8384-cd97-4cec-ae70-918f86112a99\") " pod="openstack/cinder-db-sync-x9lv2"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.628308 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a53e8384-cd97-4cec-ae70-918f86112a99-scripts\") pod \"cinder-db-sync-x9lv2\" (UID: \"a53e8384-cd97-4cec-ae70-918f86112a99\") " pod="openstack/cinder-db-sync-x9lv2"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.661909 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6bbd487785-qh8xz" podStartSLOduration=3.123125291 podStartE2EDuration="15.661866862s" podCreationTimestamp="2025-10-07 17:22:01 +0000 UTC" firstStartedPulling="2025-10-07 17:22:02.374796263 +0000 UTC m=+1126.022207818" lastFinishedPulling="2025-10-07 17:22:14.913537834 +0000 UTC m=+1138.560949389" observedRunningTime="2025-10-07 17:22:16.656830312 +0000 UTC m=+1140.304241867" watchObservedRunningTime="2025-10-07 17:22:16.661866862 +0000 UTC m=+1140.309278407"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.675320 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k2qcd"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.737984 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qftrr\" (UniqueName: \"kubernetes.io/projected/a53e8384-cd97-4cec-ae70-918f86112a99-kube-api-access-qftrr\") pod \"cinder-db-sync-x9lv2\" (UID: \"a53e8384-cd97-4cec-ae70-918f86112a99\") " pod="openstack/cinder-db-sync-x9lv2"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.739918 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a53e8384-cd97-4cec-ae70-918f86112a99-etc-machine-id\") pod \"cinder-db-sync-x9lv2\" (UID: \"a53e8384-cd97-4cec-ae70-918f86112a99\") " pod="openstack/cinder-db-sync-x9lv2"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.739991 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a53e8384-cd97-4cec-ae70-918f86112a99-scripts\") pod \"cinder-db-sync-x9lv2\" (UID: \"a53e8384-cd97-4cec-ae70-918f86112a99\") " pod="openstack/cinder-db-sync-x9lv2"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.740061 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a53e8384-cd97-4cec-ae70-918f86112a99-db-sync-config-data\") pod \"cinder-db-sync-x9lv2\" (UID: \"a53e8384-cd97-4cec-ae70-918f86112a99\") " pod="openstack/cinder-db-sync-x9lv2"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.740098 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a53e8384-cd97-4cec-ae70-918f86112a99-combined-ca-bundle\") pod \"cinder-db-sync-x9lv2\" (UID: \"a53e8384-cd97-4cec-ae70-918f86112a99\") " pod="openstack/cinder-db-sync-x9lv2"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.740264 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a53e8384-cd97-4cec-ae70-918f86112a99-config-data\") pod \"cinder-db-sync-x9lv2\" (UID: \"a53e8384-cd97-4cec-ae70-918f86112a99\") " pod="openstack/cinder-db-sync-x9lv2"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.743983 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a53e8384-cd97-4cec-ae70-918f86112a99-etc-machine-id\") pod \"cinder-db-sync-x9lv2\" (UID: \"a53e8384-cd97-4cec-ae70-918f86112a99\") " pod="openstack/cinder-db-sync-x9lv2"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.747124 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-ccnch"]
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.767107 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a53e8384-cd97-4cec-ae70-918f86112a99-db-sync-config-data\") pod \"cinder-db-sync-x9lv2\" (UID: \"a53e8384-cd97-4cec-ae70-918f86112a99\") " pod="openstack/cinder-db-sync-x9lv2"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.789672 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qftrr\" (UniqueName: \"kubernetes.io/projected/a53e8384-cd97-4cec-ae70-918f86112a99-kube-api-access-qftrr\") pod \"cinder-db-sync-x9lv2\" (UID: \"a53e8384-cd97-4cec-ae70-918f86112a99\") " pod="openstack/cinder-db-sync-x9lv2"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.790264 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a53e8384-cd97-4cec-ae70-918f86112a99-scripts\") pod \"cinder-db-sync-x9lv2\" (UID: \"a53e8384-cd97-4cec-ae70-918f86112a99\") " pod="openstack/cinder-db-sync-x9lv2"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.791318 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-m2s62" podStartSLOduration=4.122466538 podStartE2EDuration="19.791292236s" podCreationTimestamp="2025-10-07 17:21:57 +0000 UTC" firstStartedPulling="2025-10-07 17:21:59.260836393 +0000 UTC m=+1122.908247948" lastFinishedPulling="2025-10-07 17:22:14.929662091 +0000 UTC m=+1138.577073646" observedRunningTime="2025-10-07 17:22:16.693631864 +0000 UTC m=+1140.341043429" watchObservedRunningTime="2025-10-07 17:22:16.791292236 +0000 UTC m=+1140.438703801"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.800978 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a53e8384-cd97-4cec-ae70-918f86112a99-config-data\") pod \"cinder-db-sync-x9lv2\" (UID: \"a53e8384-cd97-4cec-ae70-918f86112a99\") " pod="openstack/cinder-db-sync-x9lv2"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.801548 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a53e8384-cd97-4cec-ae70-918f86112a99-combined-ca-bundle\") pod \"cinder-db-sync-x9lv2\" (UID: \"a53e8384-cd97-4cec-ae70-918f86112a99\") " pod="openstack/cinder-db-sync-x9lv2"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.817500 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-ccnch"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.833030 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-dlsjp"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.833367 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.833509 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.874365 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236dd612-86c8-413b-8ec4-c0f2a55fbf9a-combined-ca-bundle\") pod \"neutron-db-sync-ccnch\" (UID: \"236dd612-86c8-413b-8ec4-c0f2a55fbf9a\") " pod="openstack/neutron-db-sync-ccnch"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.874419 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/236dd612-86c8-413b-8ec4-c0f2a55fbf9a-config\") pod \"neutron-db-sync-ccnch\" (UID: \"236dd612-86c8-413b-8ec4-c0f2a55fbf9a\") " pod="openstack/neutron-db-sync-ccnch"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.874696 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nh6q\" (UniqueName: \"kubernetes.io/projected/236dd612-86c8-413b-8ec4-c0f2a55fbf9a-kube-api-access-6nh6q\") pod \"neutron-db-sync-ccnch\" (UID: \"236dd612-86c8-413b-8ec4-c0f2a55fbf9a\") " pod="openstack/neutron-db-sync-ccnch"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.925056 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-ccnch"]
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.944491 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-x9lv2"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.964788 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-f945f854d-hm49c" podStartSLOduration=9.964763603 podStartE2EDuration="9.964763603s" podCreationTimestamp="2025-10-07 17:22:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:22:16.746655746 +0000 UTC m=+1140.394067321" watchObservedRunningTime="2025-10-07 17:22:16.964763603 +0000 UTC m=+1140.612175158"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.976597 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236dd612-86c8-413b-8ec4-c0f2a55fbf9a-combined-ca-bundle\") pod \"neutron-db-sync-ccnch\" (UID: \"236dd612-86c8-413b-8ec4-c0f2a55fbf9a\") " pod="openstack/neutron-db-sync-ccnch"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.976644 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/236dd612-86c8-413b-8ec4-c0f2a55fbf9a-config\") pod \"neutron-db-sync-ccnch\" (UID: \"236dd612-86c8-413b-8ec4-c0f2a55fbf9a\") " pod="openstack/neutron-db-sync-ccnch"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.976736 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nh6q\" (UniqueName: \"kubernetes.io/projected/236dd612-86c8-413b-8ec4-c0f2a55fbf9a-kube-api-access-6nh6q\") pod \"neutron-db-sync-ccnch\" (UID: \"236dd612-86c8-413b-8ec4-c0f2a55fbf9a\") " pod="openstack/neutron-db-sync-ccnch"
Oct 07 17:22:16 crc kubenswrapper[4681]: I1007 17:22:16.984347 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.000772 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236dd612-86c8-413b-8ec4-c0f2a55fbf9a-combined-ca-bundle\") pod \"neutron-db-sync-ccnch\" (UID: \"236dd612-86c8-413b-8ec4-c0f2a55fbf9a\") " pod="openstack/neutron-db-sync-ccnch"
Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.002667 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/236dd612-86c8-413b-8ec4-c0f2a55fbf9a-config\") pod \"neutron-db-sync-ccnch\" (UID: \"236dd612-86c8-413b-8ec4-c0f2a55fbf9a\") " pod="openstack/neutron-db-sync-ccnch"
Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.017410 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nh6q\" (UniqueName: \"kubernetes.io/projected/236dd612-86c8-413b-8ec4-c0f2a55fbf9a-kube-api-access-6nh6q\") pod \"neutron-db-sync-ccnch\" (UID: \"236dd612-86c8-413b-8ec4-c0f2a55fbf9a\") " pod="openstack/neutron-db-sync-ccnch"
Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.147555 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4a8c402-ed2a-431a-8bc6-1eafbf0f391b" path="/var/lib/kubelet/pods/e4a8c402-ed2a-431a-8bc6-1eafbf0f391b/volumes"
Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.148713 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edde6df1-fefc-48ee-b81a-638a564a6e18" path="/var/lib/kubelet/pods/edde6df1-fefc-48ee-b81a-638a564a6e18/volumes"
Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.149431 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-49r7s"]
Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.151276 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-49r7s"]
Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.151373 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s"
Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.192085 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-49r7s\" (UID: \"a224f056-2957-405d-a927-4a1b24b01979\") " pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s"
Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.192159 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-config\") pod \"dnsmasq-dns-56df8fb6b7-49r7s\" (UID: \"a224f056-2957-405d-a927-4a1b24b01979\") " pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s"
Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.192267 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-49r7s\" (UID: \"a224f056-2957-405d-a927-4a1b24b01979\") " pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s"
Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.192428 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-49r7s\" (UID: \"a224f056-2957-405d-a927-4a1b24b01979\") " pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s"
Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.192615 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-49r7s\" (UID: \"a224f056-2957-405d-a927-4a1b24b01979\") " pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s"
Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.192651 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhmcx\" (UniqueName: \"kubernetes.io/projected/a224f056-2957-405d-a927-4a1b24b01979-kube-api-access-bhmcx\") pod \"dnsmasq-dns-56df8fb6b7-49r7s\" (UID: \"a224f056-2957-405d-a927-4a1b24b01979\") " pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s"
Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.224511 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-dlsjp"
Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.232191 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-ccnch"
Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.297592 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-49r7s\" (UID: \"a224f056-2957-405d-a927-4a1b24b01979\") " pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s"
Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.297646 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-config\") pod \"dnsmasq-dns-56df8fb6b7-49r7s\" (UID: \"a224f056-2957-405d-a927-4a1b24b01979\") " pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s"
Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.297779 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-49r7s\" (UID: \"a224f056-2957-405d-a927-4a1b24b01979\") " pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s"
Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.298049 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-49r7s\" (UID: \"a224f056-2957-405d-a927-4a1b24b01979\") " pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s"
Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.298182 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-49r7s\" (UID: \"a224f056-2957-405d-a927-4a1b24b01979\") " pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s"
Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.298205 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhmcx\" (UniqueName: \"kubernetes.io/projected/a224f056-2957-405d-a927-4a1b24b01979-kube-api-access-bhmcx\") pod \"dnsmasq-dns-56df8fb6b7-49r7s\" (UID: \"a224f056-2957-405d-a927-4a1b24b01979\") " pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s"
Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.299301 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-config\") pod \"dnsmasq-dns-56df8fb6b7-49r7s\" (UID: \"a224f056-2957-405d-a927-4a1b24b01979\") " pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s"
Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.299654 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-49r7s\" (UID: \"a224f056-2957-405d-a927-4a1b24b01979\") " pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s"
Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.304119 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-49r7s\" (UID: \"a224f056-2957-405d-a927-4a1b24b01979\") " pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s"
Oct 07 17:22:17 crc kubenswrapper[4681]: I1007
17:22:17.304692 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-49r7s\" (UID: \"a224f056-2957-405d-a927-4a1b24b01979\") " pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.310763 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-49r7s\" (UID: \"a224f056-2957-405d-a927-4a1b24b01979\") " pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.334569 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhmcx\" (UniqueName: \"kubernetes.io/projected/a224f056-2957-405d-a927-4a1b24b01979-kube-api-access-bhmcx\") pod \"dnsmasq-dns-56df8fb6b7-49r7s\" (UID: \"a224f056-2957-405d-a927-4a1b24b01979\") " pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.522587 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.620668 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.620737 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.648391 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-k2qcd"] Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.679616 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.681754 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.686146 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4g6lg" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.687505 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.689231 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.710561 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-logs\") pod \"glance-default-external-api-0\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.712587 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkcv4\" (UniqueName: \"kubernetes.io/projected/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-kube-api-access-jkcv4\") pod \"glance-default-external-api-0\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.712736 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-scripts\") pod \"glance-default-external-api-0\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.712948 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-config-data\") pod \"glance-default-external-api-0\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.713068 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.713199 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.713394 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.722588 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-external-api-0"] Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.787353 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-x9lv2"] Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.814567 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-scripts\") pod \"glance-default-external-api-0\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.814643 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-config-data\") pod \"glance-default-external-api-0\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.814666 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.814701 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.814750 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.814788 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-logs\") pod \"glance-default-external-api-0\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.814815 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkcv4\" (UniqueName: \"kubernetes.io/projected/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-kube-api-access-jkcv4\") pod \"glance-default-external-api-0\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.815363 4681 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.815802 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-logs\") pod \"glance-default-external-api-0\" 
(UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.815939 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.829633 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.842484 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-scripts\") pod \"glance-default-external-api-0\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.850393 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkcv4\" (UniqueName: \"kubernetes.io/projected/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-kube-api-access-jkcv4\") pod \"glance-default-external-api-0\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.853641 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-config-data\") pod \"glance-default-external-api-0\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.916895 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8459b45747-n55dk" Oct 07 17:22:17 crc kubenswrapper[4681]: I1007 17:22:17.923091 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.019117 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.020028 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-ccnch"] Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.039598 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.043031 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.047101 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.099462 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-76c6b58665-pvxbw" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.114401 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.228439 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/60162389-9c6d-4611-94f1-16baa5b1adcc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.229737 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.229907 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m4wv\" (UniqueName: \"kubernetes.io/projected/60162389-9c6d-4611-94f1-16baa5b1adcc-kube-api-access-9m4wv\") pod \"glance-default-internal-api-0\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.230023 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60162389-9c6d-4611-94f1-16baa5b1adcc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.230112 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60162389-9c6d-4611-94f1-16baa5b1adcc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.230352 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60162389-9c6d-4611-94f1-16baa5b1adcc-logs\") pod \"glance-default-internal-api-0\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.230455 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60162389-9c6d-4611-94f1-16baa5b1adcc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.336354 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.336714 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m4wv\" (UniqueName: \"kubernetes.io/projected/60162389-9c6d-4611-94f1-16baa5b1adcc-kube-api-access-9m4wv\") pod \"glance-default-internal-api-0\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.336743 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60162389-9c6d-4611-94f1-16baa5b1adcc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.336774 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60162389-9c6d-4611-94f1-16baa5b1adcc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.336831 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60162389-9c6d-4611-94f1-16baa5b1adcc-logs\") pod \"glance-default-internal-api-0\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.336860 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60162389-9c6d-4611-94f1-16baa5b1adcc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.336995 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/60162389-9c6d-4611-94f1-16baa5b1adcc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.337585 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/60162389-9c6d-4611-94f1-16baa5b1adcc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.337597 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60162389-9c6d-4611-94f1-16baa5b1adcc-logs\") pod \"glance-default-internal-api-0\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.338083 4681 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"60162389-9c6d-4611-94f1-16baa5b1adcc\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.345554 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60162389-9c6d-4611-94f1-16baa5b1adcc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.350177 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60162389-9c6d-4611-94f1-16baa5b1adcc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.359751 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60162389-9c6d-4611-94f1-16baa5b1adcc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.375724 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m4wv\" (UniqueName: \"kubernetes.io/projected/60162389-9c6d-4611-94f1-16baa5b1adcc-kube-api-access-9m4wv\") pod \"glance-default-internal-api-0\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.395829 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.407863 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.582763 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-49r7s"] Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.650546 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ccnch" event={"ID":"236dd612-86c8-413b-8ec4-c0f2a55fbf9a","Type":"ContainerStarted","Data":"ad9d6ed8f40df2bea9b382369af9f8819e7608e44aae80308a5a7dbea2304339"} Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.663428 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k2qcd" event={"ID":"9405f877-b9a6-4d64-92f1-df500e73046f","Type":"ContainerStarted","Data":"cde73141f00eca0b7472798a820892d64bf39990390b19068b59b3b5b62b7e0e"} Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.663536 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k2qcd" event={"ID":"9405f877-b9a6-4d64-92f1-df500e73046f","Type":"ContainerStarted","Data":"e40b54390b87280b0f67a24dc80b8d822fc60bd3344103cc599a5c98d7c357d8"} Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.698279 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64677bd694-6xgb2" event={"ID":"990e1913-44d7-414b-a116-6b712547fc81","Type":"ContainerStarted","Data":"a4aad55b86a935fdd7570b3d62dd77646b0917b7fe0adb2434009ebb8ecfb75b"} Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.698370 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-k2qcd" podStartSLOduration=2.698351682 podStartE2EDuration="2.698351682s" podCreationTimestamp="2025-10-07 17:22:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:22:18.695432211 +0000 UTC m=+1142.342843776" watchObservedRunningTime="2025-10-07 17:22:18.698351682 +0000 UTC m=+1142.345763237" Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.711864 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-x9lv2" event={"ID":"a53e8384-cd97-4cec-ae70-918f86112a99","Type":"ContainerStarted","Data":"01f1a0b3e7aa451a2183604f240199c72451313a062ce872059da116669803c3"} Oct 07 17:22:18 crc kubenswrapper[4681]: I1007 17:22:18.728406 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-64677bd694-6xgb2" podStartSLOduration=11.728381825 podStartE2EDuration="11.728381825s" podCreationTimestamp="2025-10-07 17:22:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:22:18.723172801 +0000 UTC m=+1142.370584356" watchObservedRunningTime="2025-10-07 17:22:18.728381825 +0000 UTC m=+1142.375793380" Oct 07 17:22:19 crc kubenswrapper[4681]: I1007 17:22:19.006018 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 17:22:19 crc kubenswrapper[4681]: I1007 17:22:19.391948 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 17:22:19 crc kubenswrapper[4681]: I1007 17:22:19.738911 4681 generic.go:334] "Generic (PLEG): container finished" podID="a224f056-2957-405d-a927-4a1b24b01979" containerID="423b4e74661fdafbd104dab0491c59e1721cee46a46b8a99177f3567d0141b41" exitCode=0 Oct 07 17:22:19 crc kubenswrapper[4681]: I1007 
17:22:19.738998 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s" event={"ID":"a224f056-2957-405d-a927-4a1b24b01979","Type":"ContainerDied","Data":"423b4e74661fdafbd104dab0491c59e1721cee46a46b8a99177f3567d0141b41"} Oct 07 17:22:19 crc kubenswrapper[4681]: I1007 17:22:19.739050 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s" event={"ID":"a224f056-2957-405d-a927-4a1b24b01979","Type":"ContainerStarted","Data":"5b2b7510d06476133be26e9ce796eb82eb0d5d96bdf5fa2ed96121f2cc9c7ca9"} Oct 07 17:22:19 crc kubenswrapper[4681]: I1007 17:22:19.743682 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ccnch" event={"ID":"236dd612-86c8-413b-8ec4-c0f2a55fbf9a","Type":"ContainerStarted","Data":"bd9eeaa1933fc5b81841139e6a9f5b6cde0d6a5f14c328f1c1c7a60d9d0d0f73"} Oct 07 17:22:20 crc kubenswrapper[4681]: I1007 17:22:20.106161 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-ccnch" podStartSLOduration=4.106141653 podStartE2EDuration="4.106141653s" podCreationTimestamp="2025-10-07 17:22:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:22:19.788935565 +0000 UTC m=+1143.436347120" watchObservedRunningTime="2025-10-07 17:22:20.106141653 +0000 UTC m=+1143.753553208" Oct 07 17:22:20 crc kubenswrapper[4681]: I1007 17:22:20.106542 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 17:22:20 crc kubenswrapper[4681]: I1007 17:22:20.205956 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 17:22:20 crc kubenswrapper[4681]: I1007 17:22:20.784913 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f","Type":"ContainerStarted","Data":"fb26b1cddade9bd34af28fb8d3f6355abc77718060f74d2fe71029bcc0f950e9"} Oct 07 17:22:20 crc kubenswrapper[4681]: I1007 17:22:20.788005 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"60162389-9c6d-4611-94f1-16baa5b1adcc","Type":"ContainerStarted","Data":"914493f6543cc0ee1c62c1780178ca82c5021ae085f2646e1289c8526f82a2ce"} Oct 07 17:22:21 crc kubenswrapper[4681]: I1007 17:22:21.162514 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-lvsnj"] Oct 07 17:22:21 crc kubenswrapper[4681]: I1007 17:22:21.163559 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-lvsnj" Oct 07 17:22:21 crc kubenswrapper[4681]: I1007 17:22:21.166865 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 07 17:22:21 crc kubenswrapper[4681]: I1007 17:22:21.167071 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-hhdkv" Oct 07 17:22:21 crc kubenswrapper[4681]: I1007 17:22:21.178546 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-lvsnj"] Oct 07 17:22:21 crc kubenswrapper[4681]: I1007 17:22:21.261147 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6-combined-ca-bundle\") pod \"barbican-db-sync-lvsnj\" (UID: \"98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6\") " pod="openstack/barbican-db-sync-lvsnj" Oct 07 17:22:21 crc kubenswrapper[4681]: I1007 17:22:21.261198 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jghfx\" (UniqueName: \"kubernetes.io/projected/98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6-kube-api-access-jghfx\") pod \"barbican-db-sync-lvsnj\" (UID: \"98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6\") " pod="openstack/barbican-db-sync-lvsnj" Oct 07 17:22:21 crc kubenswrapper[4681]: I1007 17:22:21.261226 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6-db-sync-config-data\") pod \"barbican-db-sync-lvsnj\" (UID: \"98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6\") " pod="openstack/barbican-db-sync-lvsnj" Oct 07 17:22:21 crc kubenswrapper[4681]: I1007 17:22:21.364198 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6-combined-ca-bundle\") pod \"barbican-db-sync-lvsnj\" (UID: \"98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6\") " pod="openstack/barbican-db-sync-lvsnj" Oct 07 17:22:21 crc kubenswrapper[4681]: I1007 17:22:21.364259 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jghfx\" (UniqueName: \"kubernetes.io/projected/98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6-kube-api-access-jghfx\") pod \"barbican-db-sync-lvsnj\" (UID: \"98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6\") " pod="openstack/barbican-db-sync-lvsnj" Oct 07 17:22:21 crc kubenswrapper[4681]: I1007 17:22:21.364302 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6-db-sync-config-data\") pod \"barbican-db-sync-lvsnj\" (UID: \"98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6\") " pod="openstack/barbican-db-sync-lvsnj" Oct 07 17:22:21 crc kubenswrapper[4681]: I1007 17:22:21.377414 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6-db-sync-config-data\") pod \"barbican-db-sync-lvsnj\" (UID: \"98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6\") " pod="openstack/barbican-db-sync-lvsnj" Oct 07 17:22:21 crc kubenswrapper[4681]: I1007 17:22:21.398445 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jghfx\" (UniqueName: 
\"kubernetes.io/projected/98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6-kube-api-access-jghfx\") pod \"barbican-db-sync-lvsnj\" (UID: \"98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6\") " pod="openstack/barbican-db-sync-lvsnj" Oct 07 17:22:21 crc kubenswrapper[4681]: I1007 17:22:21.399339 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6-combined-ca-bundle\") pod \"barbican-db-sync-lvsnj\" (UID: \"98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6\") " pod="openstack/barbican-db-sync-lvsnj" Oct 07 17:22:21 crc kubenswrapper[4681]: I1007 17:22:21.512716 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lvsnj" Oct 07 17:22:21 crc kubenswrapper[4681]: I1007 17:22:21.584506 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6bbd487785-qh8xz" Oct 07 17:22:21 crc kubenswrapper[4681]: I1007 17:22:21.800387 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s" event={"ID":"a224f056-2957-405d-a927-4a1b24b01979","Type":"ContainerStarted","Data":"1b2775109d58b27cc1e8dba7d1e0d90a73552d7d8cb8527b108f90e4330d5a45"} Oct 07 17:22:21 crc kubenswrapper[4681]: I1007 17:22:21.801719 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s" Oct 07 17:22:21 crc kubenswrapper[4681]: I1007 17:22:21.805151 4681 generic.go:334] "Generic (PLEG): container finished" podID="fdb73657-7045-4536-b856-81fcc6da6718" containerID="5580871f26d9f7dcc73fee042cbdd5c2de607e9b709a34fd42a3aa1cdc023ad1" exitCode=0 Oct 07 17:22:21 crc kubenswrapper[4681]: I1007 17:22:21.805193 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-m2s62" event={"ID":"fdb73657-7045-4536-b856-81fcc6da6718","Type":"ContainerDied","Data":"5580871f26d9f7dcc73fee042cbdd5c2de607e9b709a34fd42a3aa1cdc023ad1"} Oct 07 17:22:21 crc kubenswrapper[4681]: I1007 17:22:21.822510 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s" podStartSLOduration=5.8224923440000005 podStartE2EDuration="5.822492344s" podCreationTimestamp="2025-10-07 17:22:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:22:21.818694759 +0000 UTC m=+1145.466106304" watchObservedRunningTime="2025-10-07 17:22:21.822492344 +0000 UTC m=+1145.469903899" Oct 07 17:22:21 crc kubenswrapper[4681]: I1007 17:22:21.828418 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4673f09e-2140-4dc5-ac9d-af616ddba08d","Type":"ContainerStarted","Data":"b2b40807d3dc738f5256b38322f67bc4941ad7e844bfc7a037b6434a95bfd39c"} Oct 07 17:22:22 crc kubenswrapper[4681]: I1007 17:22:22.109245 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-lvsnj"] Oct 07 17:22:22 crc kubenswrapper[4681]: I1007 17:22:22.859286 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f","Type":"ContainerStarted","Data":"36235952a3177035a7654fa92bdd7294d5336ad0439a2f3b5eeeb3b58ad04f29"} Oct 07 17:22:22 crc kubenswrapper[4681]: I1007 17:22:22.872116 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"60162389-9c6d-4611-94f1-16baa5b1adcc","Type":"ContainerStarted","Data":"a822fbc2d0606b3bfdb1cdb22adbf582a92d706e3f1ba29b93d26e13a00fd032"} Oct 07 17:22:22 crc kubenswrapper[4681]: I1007 17:22:22.876361 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lvsnj" event={"ID":"98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6","Type":"ContainerStarted","Data":"2e2c30d6b1d112ab75eabc9816ab2c1c0d18f88dae97fdde394b629eee3f10b7"} Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.491504 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-m2s62" Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.631732 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdb73657-7045-4536-b856-81fcc6da6718-combined-ca-bundle\") pod \"fdb73657-7045-4536-b856-81fcc6da6718\" (UID: \"fdb73657-7045-4536-b856-81fcc6da6718\") " Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.631846 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdb73657-7045-4536-b856-81fcc6da6718-config-data\") pod \"fdb73657-7045-4536-b856-81fcc6da6718\" (UID: \"fdb73657-7045-4536-b856-81fcc6da6718\") " Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.631933 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdb73657-7045-4536-b856-81fcc6da6718-logs\") pod \"fdb73657-7045-4536-b856-81fcc6da6718\" (UID: \"fdb73657-7045-4536-b856-81fcc6da6718\") " Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.631969 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9vgt\" (UniqueName: \"kubernetes.io/projected/fdb73657-7045-4536-b856-81fcc6da6718-kube-api-access-v9vgt\") pod \"fdb73657-7045-4536-b856-81fcc6da6718\" (UID: \"fdb73657-7045-4536-b856-81fcc6da6718\") " Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.632042 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdb73657-7045-4536-b856-81fcc6da6718-scripts\") pod \"fdb73657-7045-4536-b856-81fcc6da6718\" (UID: \"fdb73657-7045-4536-b856-81fcc6da6718\") " Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.638211 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdb73657-7045-4536-b856-81fcc6da6718-logs" (OuterVolumeSpecName: "logs") pod "fdb73657-7045-4536-b856-81fcc6da6718" (UID: "fdb73657-7045-4536-b856-81fcc6da6718"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.649060 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdb73657-7045-4536-b856-81fcc6da6718-scripts" (OuterVolumeSpecName: "scripts") pod "fdb73657-7045-4536-b856-81fcc6da6718" (UID: "fdb73657-7045-4536-b856-81fcc6da6718"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.656080 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdb73657-7045-4536-b856-81fcc6da6718-kube-api-access-v9vgt" (OuterVolumeSpecName: "kube-api-access-v9vgt") pod "fdb73657-7045-4536-b856-81fcc6da6718" (UID: "fdb73657-7045-4536-b856-81fcc6da6718"). InnerVolumeSpecName "kube-api-access-v9vgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.705189 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdb73657-7045-4536-b856-81fcc6da6718-config-data" (OuterVolumeSpecName: "config-data") pod "fdb73657-7045-4536-b856-81fcc6da6718" (UID: "fdb73657-7045-4536-b856-81fcc6da6718"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.708100 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdb73657-7045-4536-b856-81fcc6da6718-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fdb73657-7045-4536-b856-81fcc6da6718" (UID: "fdb73657-7045-4536-b856-81fcc6da6718"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.734905 4681 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdb73657-7045-4536-b856-81fcc6da6718-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.734941 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdb73657-7045-4536-b856-81fcc6da6718-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.734952 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdb73657-7045-4536-b856-81fcc6da6718-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.734962 4681 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdb73657-7045-4536-b856-81fcc6da6718-logs\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.734972 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9vgt\" (UniqueName: \"kubernetes.io/projected/fdb73657-7045-4536-b856-81fcc6da6718-kube-api-access-v9vgt\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.935183 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-68578dd4f6-bzx29"] Oct 07 17:22:23 crc kubenswrapper[4681]: E1007 17:22:23.935797 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdb73657-7045-4536-b856-81fcc6da6718" containerName="placement-db-sync" Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.935810 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdb73657-7045-4536-b856-81fcc6da6718" containerName="placement-db-sync" Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.936023 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdb73657-7045-4536-b856-81fcc6da6718" containerName="placement-db-sync" Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.946036 4681 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-m2s62" Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.964194 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-68578dd4f6-bzx29"] Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.964236 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-m2s62" event={"ID":"fdb73657-7045-4536-b856-81fcc6da6718","Type":"ContainerDied","Data":"845b91539d63df41d4b814a0d8bde2491b31caef80995ab7973d2ecb39765e58"} Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.964267 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="845b91539d63df41d4b814a0d8bde2491b31caef80995ab7973d2ecb39765e58" Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.964347 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.969106 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.977931 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-pvcw8" Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.978132 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.978239 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 07 17:22:23 crc kubenswrapper[4681]: I1007 17:22:23.978393 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.066880 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d112a4c-ca20-4593-ac26-4e88a56ca00a-scripts\") pod \"placement-68578dd4f6-bzx29\" (UID: \"0d112a4c-ca20-4593-ac26-4e88a56ca00a\") " pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.066971 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d112a4c-ca20-4593-ac26-4e88a56ca00a-public-tls-certs\") pod \"placement-68578dd4f6-bzx29\" (UID: \"0d112a4c-ca20-4593-ac26-4e88a56ca00a\") " pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.066991 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vbx4\" (UniqueName: \"kubernetes.io/projected/0d112a4c-ca20-4593-ac26-4e88a56ca00a-kube-api-access-8vbx4\") pod \"placement-68578dd4f6-bzx29\" (UID: \"0d112a4c-ca20-4593-ac26-4e88a56ca00a\") " pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.067083 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d112a4c-ca20-4593-ac26-4e88a56ca00a-logs\") pod \"placement-68578dd4f6-bzx29\" (UID: \"0d112a4c-ca20-4593-ac26-4e88a56ca00a\") " pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.067097 4681 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d112a4c-ca20-4593-ac26-4e88a56ca00a-config-data\") pod \"placement-68578dd4f6-bzx29\" (UID: \"0d112a4c-ca20-4593-ac26-4e88a56ca00a\") " pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.067169 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d112a4c-ca20-4593-ac26-4e88a56ca00a-combined-ca-bundle\") pod \"placement-68578dd4f6-bzx29\" (UID: \"0d112a4c-ca20-4593-ac26-4e88a56ca00a\") " pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.067192 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d112a4c-ca20-4593-ac26-4e88a56ca00a-internal-tls-certs\") pod \"placement-68578dd4f6-bzx29\" (UID: \"0d112a4c-ca20-4593-ac26-4e88a56ca00a\") " pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.168397 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d112a4c-ca20-4593-ac26-4e88a56ca00a-internal-tls-certs\") pod \"placement-68578dd4f6-bzx29\" (UID: \"0d112a4c-ca20-4593-ac26-4e88a56ca00a\") " pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.168460 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d112a4c-ca20-4593-ac26-4e88a56ca00a-scripts\") pod \"placement-68578dd4f6-bzx29\" (UID: \"0d112a4c-ca20-4593-ac26-4e88a56ca00a\") " pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.168504 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d112a4c-ca20-4593-ac26-4e88a56ca00a-public-tls-certs\") pod \"placement-68578dd4f6-bzx29\" (UID: \"0d112a4c-ca20-4593-ac26-4e88a56ca00a\") " pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.168531 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vbx4\" (UniqueName: \"kubernetes.io/projected/0d112a4c-ca20-4593-ac26-4e88a56ca00a-kube-api-access-8vbx4\") pod \"placement-68578dd4f6-bzx29\" (UID: \"0d112a4c-ca20-4593-ac26-4e88a56ca00a\") " pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.168629 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d112a4c-ca20-4593-ac26-4e88a56ca00a-logs\") pod \"placement-68578dd4f6-bzx29\" (UID: \"0d112a4c-ca20-4593-ac26-4e88a56ca00a\") " pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.168653 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d112a4c-ca20-4593-ac26-4e88a56ca00a-config-data\") pod \"placement-68578dd4f6-bzx29\" (UID: \"0d112a4c-ca20-4593-ac26-4e88a56ca00a\") " pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.168727 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d112a4c-ca20-4593-ac26-4e88a56ca00a-combined-ca-bundle\") pod \"placement-68578dd4f6-bzx29\" (UID: \"0d112a4c-ca20-4593-ac26-4e88a56ca00a\") " pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.169530 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d112a4c-ca20-4593-ac26-4e88a56ca00a-logs\") pod \"placement-68578dd4f6-bzx29\" (UID: \"0d112a4c-ca20-4593-ac26-4e88a56ca00a\") " pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.173904 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d112a4c-ca20-4593-ac26-4e88a56ca00a-public-tls-certs\") pod \"placement-68578dd4f6-bzx29\" (UID: \"0d112a4c-ca20-4593-ac26-4e88a56ca00a\") " pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.174788 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d112a4c-ca20-4593-ac26-4e88a56ca00a-combined-ca-bundle\") pod \"placement-68578dd4f6-bzx29\" (UID: \"0d112a4c-ca20-4593-ac26-4e88a56ca00a\") " pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.182279 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d112a4c-ca20-4593-ac26-4e88a56ca00a-internal-tls-certs\") pod \"placement-68578dd4f6-bzx29\" (UID: \"0d112a4c-ca20-4593-ac26-4e88a56ca00a\") " pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.182405 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d112a4c-ca20-4593-ac26-4e88a56ca00a-scripts\") pod \"placement-68578dd4f6-bzx29\" (UID: \"0d112a4c-ca20-4593-ac26-4e88a56ca00a\") " pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.182443 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d112a4c-ca20-4593-ac26-4e88a56ca00a-config-data\") pod \"placement-68578dd4f6-bzx29\" (UID: \"0d112a4c-ca20-4593-ac26-4e88a56ca00a\") " pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.185804 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vbx4\" (UniqueName: \"kubernetes.io/projected/0d112a4c-ca20-4593-ac26-4e88a56ca00a-kube-api-access-8vbx4\") pod \"placement-68578dd4f6-bzx29\" (UID: \"0d112a4c-ca20-4593-ac26-4e88a56ca00a\") " pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.343152 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.980563 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f","Type":"ContainerStarted","Data":"5618ecb4c3cf0655dc339feb2b848def218cca8f7380f324cd47daee50567a8e"} Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.980785 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f" containerName="glance-log" containerID="cri-o://36235952a3177035a7654fa92bdd7294d5336ad0439a2f3b5eeeb3b58ad04f29" gracePeriod=30 Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.981189 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f" containerName="glance-httpd" containerID="cri-o://5618ecb4c3cf0655dc339feb2b848def218cca8f7380f324cd47daee50567a8e" gracePeriod=30 Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.986854 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"60162389-9c6d-4611-94f1-16baa5b1adcc","Type":"ContainerStarted","Data":"71f3de975f8a8db69a439744ceb7374f7cfbf2344337397318ae90074e35a62d"} Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.986986 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="60162389-9c6d-4611-94f1-16baa5b1adcc" containerName="glance-httpd" containerID="cri-o://71f3de975f8a8db69a439744ceb7374f7cfbf2344337397318ae90074e35a62d" gracePeriod=30 Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.986990 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="60162389-9c6d-4611-94f1-16baa5b1adcc" containerName="glance-log" containerID="cri-o://a822fbc2d0606b3bfdb1cdb22adbf582a92d706e3f1ba29b93d26e13a00fd032" gracePeriod=30 Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.993998 4681 generic.go:334] "Generic (PLEG): container finished" podID="9405f877-b9a6-4d64-92f1-df500e73046f" containerID="cde73141f00eca0b7472798a820892d64bf39990390b19068b59b3b5b62b7e0e" exitCode=0 Oct 07 17:22:24 crc kubenswrapper[4681]: I1007 17:22:24.994038 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k2qcd" event={"ID":"9405f877-b9a6-4d64-92f1-df500e73046f","Type":"ContainerDied","Data":"cde73141f00eca0b7472798a820892d64bf39990390b19068b59b3b5b62b7e0e"} Oct 07 17:22:25 crc kubenswrapper[4681]: I1007 17:22:25.012685 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-68578dd4f6-bzx29"] Oct 07 17:22:25 crc kubenswrapper[4681]: I1007 17:22:25.034652 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.03463379 podStartE2EDuration="9.03463379s" podCreationTimestamp="2025-10-07 17:22:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:22:25.005050748 +0000 UTC m=+1148.652462303" watchObservedRunningTime="2025-10-07 17:22:25.03463379 +0000 UTC m=+1148.682045355" Oct 07 17:22:25 crc kubenswrapper[4681]: I1007 17:22:25.051404 4681 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.051387356 podStartE2EDuration="9.051387356s" podCreationTimestamp="2025-10-07 17:22:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:22:25.049174754 +0000 UTC m=+1148.696586309" watchObservedRunningTime="2025-10-07 17:22:25.051387356 +0000 UTC m=+1148.698798911" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.033234 4681 generic.go:334] "Generic (PLEG): container finished" podID="60162389-9c6d-4611-94f1-16baa5b1adcc" containerID="71f3de975f8a8db69a439744ceb7374f7cfbf2344337397318ae90074e35a62d" exitCode=0 Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.033701 4681 generic.go:334] "Generic (PLEG): container finished" podID="60162389-9c6d-4611-94f1-16baa5b1adcc" containerID="a822fbc2d0606b3bfdb1cdb22adbf582a92d706e3f1ba29b93d26e13a00fd032" exitCode=143 Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.033714 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"60162389-9c6d-4611-94f1-16baa5b1adcc","Type":"ContainerDied","Data":"71f3de975f8a8db69a439744ceb7374f7cfbf2344337397318ae90074e35a62d"} Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.033772 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"60162389-9c6d-4611-94f1-16baa5b1adcc","Type":"ContainerDied","Data":"a822fbc2d0606b3bfdb1cdb22adbf582a92d706e3f1ba29b93d26e13a00fd032"} Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.041217 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68578dd4f6-bzx29" event={"ID":"0d112a4c-ca20-4593-ac26-4e88a56ca00a","Type":"ContainerStarted","Data":"a2d6ca4db6a58a54e5dbd81fb01e4bf2175b9d0175b1e1a3f2684b0172171637"} Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.041251 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68578dd4f6-bzx29" event={"ID":"0d112a4c-ca20-4593-ac26-4e88a56ca00a","Type":"ContainerStarted","Data":"44e02e8660292afa2a9ab15192f32fc84745ddd0ec746e3b40cfb29018a330d9"} Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.054072 4681 generic.go:334] "Generic (PLEG): container finished" podID="4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f" containerID="36235952a3177035a7654fa92bdd7294d5336ad0439a2f3b5eeeb3b58ad04f29" exitCode=143 Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.054249 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f","Type":"ContainerDied","Data":"36235952a3177035a7654fa92bdd7294d5336ad0439a2f3b5eeeb3b58ad04f29"} Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.203956 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.330431 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"60162389-9c6d-4611-94f1-16baa5b1adcc\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.330477 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60162389-9c6d-4611-94f1-16baa5b1adcc-scripts\") pod \"60162389-9c6d-4611-94f1-16baa5b1adcc\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.330610 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/60162389-9c6d-4611-94f1-16baa5b1adcc-httpd-run\") pod \"60162389-9c6d-4611-94f1-16baa5b1adcc\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.330634 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60162389-9c6d-4611-94f1-16baa5b1adcc-config-data\") pod \"60162389-9c6d-4611-94f1-16baa5b1adcc\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.330649 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60162389-9c6d-4611-94f1-16baa5b1adcc-logs\") pod \"60162389-9c6d-4611-94f1-16baa5b1adcc\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.330698 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60162389-9c6d-4611-94f1-16baa5b1adcc-combined-ca-bundle\") pod \"60162389-9c6d-4611-94f1-16baa5b1adcc\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.330757 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9m4wv\" (UniqueName: \"kubernetes.io/projected/60162389-9c6d-4611-94f1-16baa5b1adcc-kube-api-access-9m4wv\") pod \"60162389-9c6d-4611-94f1-16baa5b1adcc\" (UID: \"60162389-9c6d-4611-94f1-16baa5b1adcc\") " Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.333562 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60162389-9c6d-4611-94f1-16baa5b1adcc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "60162389-9c6d-4611-94f1-16baa5b1adcc" (UID: "60162389-9c6d-4611-94f1-16baa5b1adcc"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.361143 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60162389-9c6d-4611-94f1-16baa5b1adcc-kube-api-access-9m4wv" (OuterVolumeSpecName: "kube-api-access-9m4wv") pod "60162389-9c6d-4611-94f1-16baa5b1adcc" (UID: "60162389-9c6d-4611-94f1-16baa5b1adcc"). InnerVolumeSpecName "kube-api-access-9m4wv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.361438 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60162389-9c6d-4611-94f1-16baa5b1adcc-logs" (OuterVolumeSpecName: "logs") pod "60162389-9c6d-4611-94f1-16baa5b1adcc" (UID: "60162389-9c6d-4611-94f1-16baa5b1adcc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.365061 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "60162389-9c6d-4611-94f1-16baa5b1adcc" (UID: "60162389-9c6d-4611-94f1-16baa5b1adcc"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.376171 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60162389-9c6d-4611-94f1-16baa5b1adcc-scripts" (OuterVolumeSpecName: "scripts") pod "60162389-9c6d-4611-94f1-16baa5b1adcc" (UID: "60162389-9c6d-4611-94f1-16baa5b1adcc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.414985 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60162389-9c6d-4611-94f1-16baa5b1adcc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60162389-9c6d-4611-94f1-16baa5b1adcc" (UID: "60162389-9c6d-4611-94f1-16baa5b1adcc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.434037 4681 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60162389-9c6d-4611-94f1-16baa5b1adcc-logs\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.434077 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60162389-9c6d-4611-94f1-16baa5b1adcc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.434088 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9m4wv\" (UniqueName: \"kubernetes.io/projected/60162389-9c6d-4611-94f1-16baa5b1adcc-kube-api-access-9m4wv\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.434113 4681 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.434122 4681 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60162389-9c6d-4611-94f1-16baa5b1adcc-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.434131 4681 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/60162389-9c6d-4611-94f1-16baa5b1adcc-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.443053 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60162389-9c6d-4611-94f1-16baa5b1adcc-config-data" (OuterVolumeSpecName: 
"config-data") pod "60162389-9c6d-4611-94f1-16baa5b1adcc" (UID: "60162389-9c6d-4611-94f1-16baa5b1adcc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.471566 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k2qcd" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.484399 4681 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.538192 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60162389-9c6d-4611-94f1-16baa5b1adcc-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.538241 4681 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.639136 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxl5d\" (UniqueName: \"kubernetes.io/projected/9405f877-b9a6-4d64-92f1-df500e73046f-kube-api-access-bxl5d\") pod \"9405f877-b9a6-4d64-92f1-df500e73046f\" (UID: \"9405f877-b9a6-4d64-92f1-df500e73046f\") " Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.639183 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-fernet-keys\") pod \"9405f877-b9a6-4d64-92f1-df500e73046f\" (UID: \"9405f877-b9a6-4d64-92f1-df500e73046f\") " Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.639250 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-config-data\") pod \"9405f877-b9a6-4d64-92f1-df500e73046f\" (UID: \"9405f877-b9a6-4d64-92f1-df500e73046f\") " Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.639304 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-combined-ca-bundle\") pod \"9405f877-b9a6-4d64-92f1-df500e73046f\" (UID: \"9405f877-b9a6-4d64-92f1-df500e73046f\") " Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.639360 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-credential-keys\") pod \"9405f877-b9a6-4d64-92f1-df500e73046f\" (UID: \"9405f877-b9a6-4d64-92f1-df500e73046f\") " Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.639489 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-scripts\") pod \"9405f877-b9a6-4d64-92f1-df500e73046f\" (UID: \"9405f877-b9a6-4d64-92f1-df500e73046f\") " Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.649554 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9405f877-b9a6-4d64-92f1-df500e73046f-kube-api-access-bxl5d" (OuterVolumeSpecName: "kube-api-access-bxl5d") pod "9405f877-b9a6-4d64-92f1-df500e73046f" (UID: 
"9405f877-b9a6-4d64-92f1-df500e73046f"). InnerVolumeSpecName "kube-api-access-bxl5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.651207 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9405f877-b9a6-4d64-92f1-df500e73046f" (UID: "9405f877-b9a6-4d64-92f1-df500e73046f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.651404 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-scripts" (OuterVolumeSpecName: "scripts") pod "9405f877-b9a6-4d64-92f1-df500e73046f" (UID: "9405f877-b9a6-4d64-92f1-df500e73046f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.667261 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9405f877-b9a6-4d64-92f1-df500e73046f" (UID: "9405f877-b9a6-4d64-92f1-df500e73046f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.671126 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-config-data" (OuterVolumeSpecName: "config-data") pod "9405f877-b9a6-4d64-92f1-df500e73046f" (UID: "9405f877-b9a6-4d64-92f1-df500e73046f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.680010 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9405f877-b9a6-4d64-92f1-df500e73046f" (UID: "9405f877-b9a6-4d64-92f1-df500e73046f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.749063 4681 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.749256 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxl5d\" (UniqueName: \"kubernetes.io/projected/9405f877-b9a6-4d64-92f1-df500e73046f-kube-api-access-bxl5d\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.749345 4681 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.749400 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.749451 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:26 crc kubenswrapper[4681]: I1007 17:22:26.749531 4681 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9405f877-b9a6-4d64-92f1-df500e73046f-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.131386 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68578dd4f6-bzx29" event={"ID":"0d112a4c-ca20-4593-ac26-4e88a56ca00a","Type":"ContainerStarted","Data":"ab3d16133d5f8868b929d9d514d79f26d613473a66c193f8c9ea552390589c77"} Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.132600 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.132666 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.161185 4681 generic.go:334] "Generic (PLEG): container finished" podID="4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f" containerID="5618ecb4c3cf0655dc339feb2b848def218cca8f7380f324cd47daee50567a8e" exitCode=0 Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.161284 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f","Type":"ContainerDied","Data":"5618ecb4c3cf0655dc339feb2b848def218cca8f7380f324cd47daee50567a8e"} Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.189488 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-68578dd4f6-bzx29" podStartSLOduration=4.189101236 podStartE2EDuration="4.189101236s" podCreationTimestamp="2025-10-07 17:22:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:22:27.168600456 +0000 UTC m=+1150.816012011" watchObservedRunningTime="2025-10-07 17:22:27.189101236 +0000 UTC m=+1150.836512791" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.196625 4681 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"60162389-9c6d-4611-94f1-16baa5b1adcc","Type":"ContainerDied","Data":"914493f6543cc0ee1c62c1780178ca82c5021ae085f2646e1289c8526f82a2ce"} Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.196686 4681 scope.go:117] "RemoveContainer" containerID="71f3de975f8a8db69a439744ceb7374f7cfbf2344337397318ae90074e35a62d" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.196841 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.227272 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k2qcd" event={"ID":"9405f877-b9a6-4d64-92f1-df500e73046f","Type":"ContainerDied","Data":"e40b54390b87280b0f67a24dc80b8d822fc60bd3344103cc599a5c98d7c357d8"} Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.227311 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e40b54390b87280b0f67a24dc80b8d822fc60bd3344103cc599a5c98d7c357d8" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.227392 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k2qcd" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.232955 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7db8ffcf86-wnnfn"] Oct 07 17:22:27 crc kubenswrapper[4681]: E1007 17:22:27.233538 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60162389-9c6d-4611-94f1-16baa5b1adcc" containerName="glance-log" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.233605 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="60162389-9c6d-4611-94f1-16baa5b1adcc" containerName="glance-log" Oct 07 17:22:27 crc kubenswrapper[4681]: E1007 17:22:27.233658 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9405f877-b9a6-4d64-92f1-df500e73046f" containerName="keystone-bootstrap" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.233715 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="9405f877-b9a6-4d64-92f1-df500e73046f" containerName="keystone-bootstrap" Oct 07 17:22:27 crc kubenswrapper[4681]: E1007 17:22:27.233777 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60162389-9c6d-4611-94f1-16baa5b1adcc" containerName="glance-httpd" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.233823 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="60162389-9c6d-4611-94f1-16baa5b1adcc" containerName="glance-httpd" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.234078 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="60162389-9c6d-4611-94f1-16baa5b1adcc" containerName="glance-httpd" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.234161 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="9405f877-b9a6-4d64-92f1-df500e73046f" containerName="keystone-bootstrap" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.234234 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="60162389-9c6d-4611-94f1-16baa5b1adcc" containerName="glance-log" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.235345 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.254216 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.254250 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.265018 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.281969 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7db8ffcf86-wnnfn"] Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.295535 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.342026 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.347438 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.347466 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.350729 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.351545 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.364307 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd-fernet-keys\") pod \"keystone-7db8ffcf86-wnnfn\" (UID: \"ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd\") " pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.364376 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd-scripts\") pod \"keystone-7db8ffcf86-wnnfn\" (UID: \"ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd\") " pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.364399 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsqrz\" (UniqueName: \"kubernetes.io/projected/ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd-kube-api-access-nsqrz\") pod \"keystone-7db8ffcf86-wnnfn\" (UID: \"ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd\") " pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.364437 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd-config-data\") pod \"keystone-7db8ffcf86-wnnfn\" (UID: \"ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd\") " pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.364465 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd-credential-keys\") pod \"keystone-7db8ffcf86-wnnfn\" (UID: \"ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd\") " pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.364489 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd-combined-ca-bundle\") pod \"keystone-7db8ffcf86-wnnfn\" (UID: \"ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd\") " pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.364506 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd-internal-tls-certs\") pod \"keystone-7db8ffcf86-wnnfn\" (UID: \"ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd\") " pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.364543 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd-public-tls-certs\") pod \"keystone-7db8ffcf86-wnnfn\" (UID: \"ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd\") " pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.399069 4681 scope.go:117] "RemoveContainer" containerID="a822fbc2d0606b3bfdb1cdb22adbf582a92d706e3f1ba29b93d26e13a00fd032" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.412635 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.441544 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.442228 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.443356 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-64677bd694-6xgb2" podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.142:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.142:8443: connect: connection refused" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.469356 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e420b4e3-f7b8-4d53-8b39-99ae105c3079-logs\") pod \"glance-default-internal-api-0\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.469461 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd-fernet-keys\") pod \"keystone-7db8ffcf86-wnnfn\" (UID: \"ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd\") " pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.469597 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e420b4e3-f7b8-4d53-8b39-99ae105c3079-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.469680 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd-scripts\") pod \"keystone-7db8ffcf86-wnnfn\" (UID: \"ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd\") " pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.469713 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsqrz\" (UniqueName: \"kubernetes.io/projected/ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd-kube-api-access-nsqrz\") pod \"keystone-7db8ffcf86-wnnfn\" (UID: \"ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd\") " pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.469740 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e420b4e3-f7b8-4d53-8b39-99ae105c3079-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.469777 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e420b4e3-f7b8-4d53-8b39-99ae105c3079-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.469831 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd-config-data\") pod \"keystone-7db8ffcf86-wnnfn\" (UID: \"ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd\") " pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.469904 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vnsd\" (UniqueName: \"kubernetes.io/projected/e420b4e3-f7b8-4d53-8b39-99ae105c3079-kube-api-access-8vnsd\") pod \"glance-default-internal-api-0\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.469944 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd-credential-keys\") pod \"keystone-7db8ffcf86-wnnfn\" (UID: \"ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd\") " pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.469972 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e420b4e3-f7b8-4d53-8b39-99ae105c3079-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.470005 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd-combined-ca-bundle\") pod \"keystone-7db8ffcf86-wnnfn\" (UID: \"ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd\") " pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.470022 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd-internal-tls-certs\") pod \"keystone-7db8ffcf86-wnnfn\" (UID: \"ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd\") " pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.470036 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.470121 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd-public-tls-certs\") pod \"keystone-7db8ffcf86-wnnfn\" (UID: \"ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd\") " pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.470189 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e420b4e3-f7b8-4d53-8b39-99ae105c3079-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.482903 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd-fernet-keys\") pod \"keystone-7db8ffcf86-wnnfn\" (UID: \"ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd\") " pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.486664 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd-config-data\") pod \"keystone-7db8ffcf86-wnnfn\" (UID: \"ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd\") " pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.490563 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd-combined-ca-bundle\") pod \"keystone-7db8ffcf86-wnnfn\" (UID: \"ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd\") " pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.506534 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd-public-tls-certs\") pod \"keystone-7db8ffcf86-wnnfn\" (UID: \"ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd\") " pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.508293 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd-internal-tls-certs\") pod 
\"keystone-7db8ffcf86-wnnfn\" (UID: \"ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd\") " pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.509982 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd-scripts\") pod \"keystone-7db8ffcf86-wnnfn\" (UID: \"ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd\") " pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.513440 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsqrz\" (UniqueName: \"kubernetes.io/projected/ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd-kube-api-access-nsqrz\") pod \"keystone-7db8ffcf86-wnnfn\" (UID: \"ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd\") " pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.513455 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd-credential-keys\") pod \"keystone-7db8ffcf86-wnnfn\" (UID: \"ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd\") " pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.535555 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.571825 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkcv4\" (UniqueName: \"kubernetes.io/projected/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-kube-api-access-jkcv4\") pod \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.572026 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-httpd-run\") pod \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.572049 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-logs\") pod \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.572113 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.572168 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-config-data\") pod \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.572234 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-scripts\") pod \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.572278 4681 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-combined-ca-bundle\") pod \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\" (UID: \"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f\") " Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.572576 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e420b4e3-f7b8-4d53-8b39-99ae105c3079-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.572613 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e420b4e3-f7b8-4d53-8b39-99ae105c3079-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.572643 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e420b4e3-f7b8-4d53-8b39-99ae105c3079-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.572687 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vnsd\" (UniqueName: \"kubernetes.io/projected/e420b4e3-f7b8-4d53-8b39-99ae105c3079-kube-api-access-8vnsd\") pod \"glance-default-internal-api-0\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.572718 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e420b4e3-f7b8-4d53-8b39-99ae105c3079-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.572743 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.572849 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e420b4e3-f7b8-4d53-8b39-99ae105c3079-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.572874 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e420b4e3-f7b8-4d53-8b39-99ae105c3079-logs\") pod \"glance-default-internal-api-0\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.572910 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f" (UID: "4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.573102 4681 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.576361 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-logs" (OuterVolumeSpecName: "logs") pod "4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f" (UID: "4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.576791 4681 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.577036 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f" (UID: "4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.579247 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e420b4e3-f7b8-4d53-8b39-99ae105c3079-logs\") pod \"glance-default-internal-api-0\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.579788 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e420b4e3-f7b8-4d53-8b39-99ae105c3079-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.590619 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e420b4e3-f7b8-4d53-8b39-99ae105c3079-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.591298 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e420b4e3-f7b8-4d53-8b39-99ae105c3079-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.591626 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-scripts" (OuterVolumeSpecName: "scripts") pod 
"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f" (UID: "4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.624744 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vnsd\" (UniqueName: \"kubernetes.io/projected/e420b4e3-f7b8-4d53-8b39-99ae105c3079-kube-api-access-8vnsd\") pod \"glance-default-internal-api-0\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.629558 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-kube-api-access-jkcv4" (OuterVolumeSpecName: "kube-api-access-jkcv4") pod "4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f" (UID: "4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f"). InnerVolumeSpecName "kube-api-access-jkcv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.632020 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-f945f854d-hm49c" podUID="02a91326-9285-4589-a05b-c0a2c2ed397e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.634688 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e420b4e3-f7b8-4d53-8b39-99ae105c3079-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.639540 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-bhfl5"] Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.639808 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" podUID="e5f57f38-603f-48d1-9326-8b5183fe99ae" containerName="dnsmasq-dns" containerID="cri-o://c7ad863ed46cafc8f2e967f164546d55da677e0599f1b018bc6905b29839408b" gracePeriod=10 Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.657602 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e420b4e3-f7b8-4d53-8b39-99ae105c3079-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.679300 4681 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-logs\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.679346 4681 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.679360 4681 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.679374 4681 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-jkcv4\" (UniqueName: \"kubernetes.io/projected/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-kube-api-access-jkcv4\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.699354 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.713162 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.739224 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.814559 4681 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.888540 4681 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.922513 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f" (UID: "4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.963733 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-config-data" (OuterVolumeSpecName: "config-data") pod "4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f" (UID: "4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.995443 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:27 crc kubenswrapper[4681]: I1007 17:22:27.995473 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.277422 4681 generic.go:334] "Generic (PLEG): container finished" podID="e5f57f38-603f-48d1-9326-8b5183fe99ae" containerID="c7ad863ed46cafc8f2e967f164546d55da677e0599f1b018bc6905b29839408b" exitCode=0 Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.277484 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" event={"ID":"e5f57f38-603f-48d1-9326-8b5183fe99ae","Type":"ContainerDied","Data":"c7ad863ed46cafc8f2e967f164546d55da677e0599f1b018bc6905b29839408b"} Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.295514 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f","Type":"ContainerDied","Data":"fb26b1cddade9bd34af28fb8d3f6355abc77718060f74d2fe71029bcc0f950e9"} Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.295795 4681 scope.go:117] "RemoveContainer" containerID="5618ecb4c3cf0655dc339feb2b848def218cca8f7380f324cd47daee50567a8e" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.296038 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.369020 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.386420 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.391401 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 17:22:28 crc kubenswrapper[4681]: E1007 17:22:28.393108 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f" containerName="glance-log" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.393132 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f" containerName="glance-log" Oct 07 17:22:28 crc kubenswrapper[4681]: E1007 17:22:28.393167 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f" containerName="glance-httpd" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.393173 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f" containerName="glance-httpd" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.393397 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f" containerName="glance-log" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.393414 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f" containerName="glance-httpd" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 
17:22:28.394379 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.405182 4681 scope.go:117] "RemoveContainer" containerID="36235952a3177035a7654fa92bdd7294d5336ad0439a2f3b5eeeb3b58ad04f29" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.405653 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.406068 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.414190 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.496363 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.508364 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a8db576-98ff-44c4-9c62-89332a95ad61-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.508407 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a8db576-98ff-44c4-9c62-89332a95ad61-logs\") pod \"glance-default-external-api-0\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.508472 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.508516 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a8db576-98ff-44c4-9c62-89332a95ad61-config-data\") pod \"glance-default-external-api-0\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.508540 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxlp8\" (UniqueName: \"kubernetes.io/projected/8a8db576-98ff-44c4-9c62-89332a95ad61-kube-api-access-hxlp8\") pod \"glance-default-external-api-0\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.508559 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a8db576-98ff-44c4-9c62-89332a95ad61-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.508587 4681 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a8db576-98ff-44c4-9c62-89332a95ad61-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.508608 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a8db576-98ff-44c4-9c62-89332a95ad61-scripts\") pod \"glance-default-external-api-0\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.610021 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-dns-swift-storage-0\") pod \"e5f57f38-603f-48d1-9326-8b5183fe99ae\" (UID: \"e5f57f38-603f-48d1-9326-8b5183fe99ae\") " Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.610164 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-ovsdbserver-nb\") pod \"e5f57f38-603f-48d1-9326-8b5183fe99ae\" (UID: \"e5f57f38-603f-48d1-9326-8b5183fe99ae\") " Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.610184 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhnkx\" (UniqueName: \"kubernetes.io/projected/e5f57f38-603f-48d1-9326-8b5183fe99ae-kube-api-access-xhnkx\") pod \"e5f57f38-603f-48d1-9326-8b5183fe99ae\" (UID: \"e5f57f38-603f-48d1-9326-8b5183fe99ae\") " Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.610263 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-ovsdbserver-sb\") pod \"e5f57f38-603f-48d1-9326-8b5183fe99ae\" (UID: \"e5f57f38-603f-48d1-9326-8b5183fe99ae\") " Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.610327 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-config\") pod \"e5f57f38-603f-48d1-9326-8b5183fe99ae\" (UID: \"e5f57f38-603f-48d1-9326-8b5183fe99ae\") " Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.610373 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-dns-svc\") pod \"e5f57f38-603f-48d1-9326-8b5183fe99ae\" (UID: \"e5f57f38-603f-48d1-9326-8b5183fe99ae\") " Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.610616 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.610673 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a8db576-98ff-44c4-9c62-89332a95ad61-config-data\") pod \"glance-default-external-api-0\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " 
pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.610696 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxlp8\" (UniqueName: \"kubernetes.io/projected/8a8db576-98ff-44c4-9c62-89332a95ad61-kube-api-access-hxlp8\") pod \"glance-default-external-api-0\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.610720 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a8db576-98ff-44c4-9c62-89332a95ad61-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.610749 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a8db576-98ff-44c4-9c62-89332a95ad61-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.610771 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a8db576-98ff-44c4-9c62-89332a95ad61-scripts\") pod \"glance-default-external-api-0\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.610824 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a8db576-98ff-44c4-9c62-89332a95ad61-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.610839 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a8db576-98ff-44c4-9c62-89332a95ad61-logs\") pod \"glance-default-external-api-0\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.611419 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a8db576-98ff-44c4-9c62-89332a95ad61-logs\") pod \"glance-default-external-api-0\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.612505 4681 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.614650 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a8db576-98ff-44c4-9c62-89332a95ad61-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 
17:22:28.627731 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a8db576-98ff-44c4-9c62-89332a95ad61-config-data\") pod \"glance-default-external-api-0\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.629734 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a8db576-98ff-44c4-9c62-89332a95ad61-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.630255 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a8db576-98ff-44c4-9c62-89332a95ad61-scripts\") pod \"glance-default-external-api-0\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.630833 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a8db576-98ff-44c4-9c62-89332a95ad61-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.680500 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5f57f38-603f-48d1-9326-8b5183fe99ae-kube-api-access-xhnkx" (OuterVolumeSpecName: "kube-api-access-xhnkx") pod "e5f57f38-603f-48d1-9326-8b5183fe99ae" (UID: "e5f57f38-603f-48d1-9326-8b5183fe99ae"). InnerVolumeSpecName "kube-api-access-xhnkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.698483 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxlp8\" (UniqueName: \"kubernetes.io/projected/8a8db576-98ff-44c4-9c62-89332a95ad61-kube-api-access-hxlp8\") pod \"glance-default-external-api-0\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.715447 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhnkx\" (UniqueName: \"kubernetes.io/projected/e5f57f38-603f-48d1-9326-8b5183fe99ae-kube-api-access-xhnkx\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.854396 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " pod="openstack/glance-default-external-api-0" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.870392 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e5f57f38-603f-48d1-9326-8b5183fe99ae" (UID: "e5f57f38-603f-48d1-9326-8b5183fe99ae"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.878529 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-config" (OuterVolumeSpecName: "config") pod "e5f57f38-603f-48d1-9326-8b5183fe99ae" (UID: "e5f57f38-603f-48d1-9326-8b5183fe99ae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.904232 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e5f57f38-603f-48d1-9326-8b5183fe99ae" (UID: "e5f57f38-603f-48d1-9326-8b5183fe99ae"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.930524 4681 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.930558 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.930567 4681 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.950409 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e5f57f38-603f-48d1-9326-8b5183fe99ae" (UID: "e5f57f38-603f-48d1-9326-8b5183fe99ae"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.960864 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e5f57f38-603f-48d1-9326-8b5183fe99ae" (UID: "e5f57f38-603f-48d1-9326-8b5183fe99ae"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:22:28 crc kubenswrapper[4681]: I1007 17:22:28.998063 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7db8ffcf86-wnnfn"] Oct 07 17:22:29 crc kubenswrapper[4681]: I1007 17:22:29.033052 4681 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:29 crc kubenswrapper[4681]: I1007 17:22:29.033086 4681 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5f57f38-603f-48d1-9326-8b5183fe99ae-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 17:22:29 crc kubenswrapper[4681]: I1007 17:22:29.040669 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f" path="/var/lib/kubelet/pods/4b6eda5c-8ecf-4ed8-bea7-5609a9b8184f/volumes" Oct 07 17:22:29 crc kubenswrapper[4681]: I1007 17:22:29.041431 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60162389-9c6d-4611-94f1-16baa5b1adcc" path="/var/lib/kubelet/pods/60162389-9c6d-4611-94f1-16baa5b1adcc/volumes" Oct 07 17:22:29 crc kubenswrapper[4681]: I1007 17:22:29.074384 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 17:22:29 crc kubenswrapper[4681]: I1007 17:22:29.090495 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 17:22:29 crc kubenswrapper[4681]: I1007 17:22:29.384198 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" event={"ID":"e5f57f38-603f-48d1-9326-8b5183fe99ae","Type":"ContainerDied","Data":"02bfbf595d3d2df0ac70d82ba0d88fc21be259055301337480332113b56ddbca"} Oct 07 17:22:29 crc kubenswrapper[4681]: I1007 17:22:29.384498 4681 scope.go:117] "RemoveContainer" containerID="c7ad863ed46cafc8f2e967f164546d55da677e0599f1b018bc6905b29839408b" Oct 07 17:22:29 crc kubenswrapper[4681]: I1007 17:22:29.384595 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" Oct 07 17:22:29 crc kubenswrapper[4681]: I1007 17:22:29.423012 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-bhfl5"] Oct 07 17:22:29 crc kubenswrapper[4681]: I1007 17:22:29.433845 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-bhfl5"] Oct 07 17:22:29 crc kubenswrapper[4681]: I1007 17:22:29.450054 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e420b4e3-f7b8-4d53-8b39-99ae105c3079","Type":"ContainerStarted","Data":"bba5289ac4634854a786a43e67abd77dde6e0795513de9ed80e980c83bfc9a59"} Oct 07 17:22:29 crc kubenswrapper[4681]: I1007 17:22:29.469702 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7db8ffcf86-wnnfn" event={"ID":"ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd","Type":"ContainerStarted","Data":"cefad19e62837a2ba764989bcfa8ae03667259437e9779a2b4d78123ac2fb1fd"} Oct 07 17:22:29 crc kubenswrapper[4681]: I1007 17:22:29.478361 4681 scope.go:117] "RemoveContainer" containerID="4eb16e791f50e5c3058388e2e1bb9d8e83ceb4e7aebaca2f0cc24b1fa829ca77" Oct 07 17:22:29 crc kubenswrapper[4681]: I1007 17:22:29.820150 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 17:22:30 crc kubenswrapper[4681]: I1007 17:22:30.535009 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e420b4e3-f7b8-4d53-8b39-99ae105c3079","Type":"ContainerStarted","Data":"62699061cdcd57b0ac85f0b1816a2bd868995131a837bc40f529bf7450dcec8e"} Oct 07 17:22:30 crc kubenswrapper[4681]: I1007 17:22:30.552278 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8a8db576-98ff-44c4-9c62-89332a95ad61","Type":"ContainerStarted","Data":"d851e1412b7f356617810ff6ab0e97cc582e7cb3ff99afe06d288fbfeb8bceee"} Oct 07 17:22:30 crc kubenswrapper[4681]: I1007 17:22:30.564233 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7db8ffcf86-wnnfn" event={"ID":"ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd","Type":"ContainerStarted","Data":"42ea8a400b4bb8ebf0aa1eb6fb7195c8b97ebadda03500c6010e8c2f531c6308"} Oct 07 17:22:30 crc kubenswrapper[4681]: I1007 17:22:30.565302 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:22:30 crc kubenswrapper[4681]: I1007 17:22:30.602371 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7db8ffcf86-wnnfn" podStartSLOduration=3.602349986 podStartE2EDuration="3.602349986s" podCreationTimestamp="2025-10-07 17:22:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:22:30.593611584 +0000 UTC m=+1154.241023139" watchObservedRunningTime="2025-10-07 17:22:30.602349986 +0000 UTC m=+1154.249761541" Oct 07 17:22:31 crc kubenswrapper[4681]: I1007 17:22:31.044695 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5f57f38-603f-48d1-9326-8b5183fe99ae" path="/var/lib/kubelet/pods/e5f57f38-603f-48d1-9326-8b5183fe99ae/volumes" Oct 07 17:22:31 crc kubenswrapper[4681]: I1007 17:22:31.591917 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"e420b4e3-f7b8-4d53-8b39-99ae105c3079","Type":"ContainerStarted","Data":"6dad84ce9fe24dcae7da413e3f27702c59c553e8c354b26b7b553462b5470432"} Oct 07 17:22:31 crc kubenswrapper[4681]: I1007 17:22:31.609806 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8a8db576-98ff-44c4-9c62-89332a95ad61","Type":"ContainerStarted","Data":"3fef274458acb4055c335ae5b276d774add2dad4871ff025ea707ec039377925"} Oct 07 17:22:31 crc kubenswrapper[4681]: I1007 17:22:31.631029 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.6310036199999995 podStartE2EDuration="4.63100362s" podCreationTimestamp="2025-10-07 17:22:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:22:31.617565477 +0000 UTC m=+1155.264977032" watchObservedRunningTime="2025-10-07 17:22:31.63100362 +0000 UTC m=+1155.278415185" Oct 07 17:22:33 crc kubenswrapper[4681]: I1007 17:22:33.201042 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-cf78879c9-bhfl5" podUID="e5f57f38-603f-48d1-9326-8b5183fe99ae" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.136:5353: i/o timeout" Oct 07 17:22:37 crc kubenswrapper[4681]: I1007 17:22:37.441270 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-64677bd694-6xgb2" podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.142:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.142:8443: connect: connection refused" Oct 07 17:22:37 crc kubenswrapper[4681]: I1007 17:22:37.618212 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-f945f854d-hm49c" podUID="02a91326-9285-4589-a05b-c0a2c2ed397e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Oct 07 17:22:37 crc kubenswrapper[4681]: I1007 17:22:37.739636 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 17:22:37 crc kubenswrapper[4681]: I1007 17:22:37.739679 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 17:22:37 crc kubenswrapper[4681]: I1007 17:22:37.811317 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 17:22:37 crc kubenswrapper[4681]: I1007 17:22:37.823445 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 17:22:38 crc kubenswrapper[4681]: I1007 17:22:38.669584 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 17:22:38 crc kubenswrapper[4681]: I1007 17:22:38.669642 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 17:22:42 crc kubenswrapper[4681]: E1007 17:22:42.191135 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Oct 07 17:22:42 crc kubenswrapper[4681]: E1007 
17:22:42.191730 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jghfx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-lvsnj_openstack(98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 17:22:42 crc kubenswrapper[4681]: E1007 17:22:42.193010 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-lvsnj" podUID="98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6" Oct 07 17:22:42 crc kubenswrapper[4681]: E1007 17:22:42.709795 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-lvsnj" podUID="98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6" Oct 07 17:22:46 crc kubenswrapper[4681]: W1007 17:22:46.620648 4681 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b6eda5c_8ecf_4ed8_bea7_5609a9b8184f.slice/crio-conmon-36235952a3177035a7654fa92bdd7294d5336ad0439a2f3b5eeeb3b58ad04f29.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b6eda5c_8ecf_4ed8_bea7_5609a9b8184f.slice/crio-conmon-36235952a3177035a7654fa92bdd7294d5336ad0439a2f3b5eeeb3b58ad04f29.scope: no such file or directory Oct 07 17:22:46 crc kubenswrapper[4681]: W1007 17:22:46.621988 4681 watcher.go:93] Error while processing event 
("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60162389_9c6d_4611_94f1_16baa5b1adcc.slice/crio-conmon-a822fbc2d0606b3bfdb1cdb22adbf582a92d706e3f1ba29b93d26e13a00fd032.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60162389_9c6d_4611_94f1_16baa5b1adcc.slice/crio-conmon-a822fbc2d0606b3bfdb1cdb22adbf582a92d706e3f1ba29b93d26e13a00fd032.scope: no such file or directory Oct 07 17:22:46 crc kubenswrapper[4681]: W1007 17:22:46.622017 4681 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b6eda5c_8ecf_4ed8_bea7_5609a9b8184f.slice/crio-36235952a3177035a7654fa92bdd7294d5336ad0439a2f3b5eeeb3b58ad04f29.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b6eda5c_8ecf_4ed8_bea7_5609a9b8184f.slice/crio-36235952a3177035a7654fa92bdd7294d5336ad0439a2f3b5eeeb3b58ad04f29.scope: no such file or directory Oct 07 17:22:46 crc kubenswrapper[4681]: W1007 17:22:46.622030 4681 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60162389_9c6d_4611_94f1_16baa5b1adcc.slice/crio-a822fbc2d0606b3bfdb1cdb22adbf582a92d706e3f1ba29b93d26e13a00fd032.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60162389_9c6d_4611_94f1_16baa5b1adcc.slice/crio-a822fbc2d0606b3bfdb1cdb22adbf582a92d706e3f1ba29b93d26e13a00fd032.scope: no such file or directory Oct 07 17:22:46 crc kubenswrapper[4681]: W1007 17:22:46.622046 4681 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b6eda5c_8ecf_4ed8_bea7_5609a9b8184f.slice/crio-conmon-5618ecb4c3cf0655dc339feb2b848def218cca8f7380f324cd47daee50567a8e.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b6eda5c_8ecf_4ed8_bea7_5609a9b8184f.slice/crio-conmon-5618ecb4c3cf0655dc339feb2b848def218cca8f7380f324cd47daee50567a8e.scope: no such file or directory Oct 07 17:22:46 crc kubenswrapper[4681]: W1007 17:22:46.622060 4681 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60162389_9c6d_4611_94f1_16baa5b1adcc.slice/crio-conmon-71f3de975f8a8db69a439744ceb7374f7cfbf2344337397318ae90074e35a62d.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60162389_9c6d_4611_94f1_16baa5b1adcc.slice/crio-conmon-71f3de975f8a8db69a439744ceb7374f7cfbf2344337397318ae90074e35a62d.scope: no such file or directory Oct 07 17:22:46 crc kubenswrapper[4681]: W1007 17:22:46.622075 4681 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b6eda5c_8ecf_4ed8_bea7_5609a9b8184f.slice/crio-5618ecb4c3cf0655dc339feb2b848def218cca8f7380f324cd47daee50567a8e.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b6eda5c_8ecf_4ed8_bea7_5609a9b8184f.slice/crio-5618ecb4c3cf0655dc339feb2b848def218cca8f7380f324cd47daee50567a8e.scope: no such file or directory Oct 07 17:22:46 crc kubenswrapper[4681]: W1007 17:22:46.622087 4681 
watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60162389_9c6d_4611_94f1_16baa5b1adcc.slice/crio-71f3de975f8a8db69a439744ceb7374f7cfbf2344337397318ae90074e35a62d.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60162389_9c6d_4611_94f1_16baa5b1adcc.slice/crio-71f3de975f8a8db69a439744ceb7374f7cfbf2344337397318ae90074e35a62d.scope: no such file or directory Oct 07 17:22:46 crc kubenswrapper[4681]: I1007 17:22:46.749951 4681 generic.go:334] "Generic (PLEG): container finished" podID="e14c0f1b-a6a9-4e80-a975-890aac3dcd0e" containerID="71d5ff623f8b05189ca6d0b08c6b99727edaffa2405dbc5b4965eb029cb64783" exitCode=137 Oct 07 17:22:46 crc kubenswrapper[4681]: I1007 17:22:46.749991 4681 generic.go:334] "Generic (PLEG): container finished" podID="e14c0f1b-a6a9-4e80-a975-890aac3dcd0e" containerID="a03ca3b288d90e815ad9905ddfda88c242050dd2a3f9a5467015708963131c98" exitCode=137 Oct 07 17:22:46 crc kubenswrapper[4681]: I1007 17:22:46.750053 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8459b45747-n55dk" event={"ID":"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e","Type":"ContainerDied","Data":"71d5ff623f8b05189ca6d0b08c6b99727edaffa2405dbc5b4965eb029cb64783"} Oct 07 17:22:46 crc kubenswrapper[4681]: I1007 17:22:46.750088 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8459b45747-n55dk" event={"ID":"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e","Type":"ContainerDied","Data":"a03ca3b288d90e815ad9905ddfda88c242050dd2a3f9a5467015708963131c98"} Oct 07 17:22:46 crc kubenswrapper[4681]: I1007 17:22:46.755494 4681 generic.go:334] "Generic (PLEG): container finished" podID="4eee8e4e-e688-44f8-aff9-44b5f57e1b68" containerID="0eec6b6b2c006c550724278ee4876af6bcb6e713b749e75e6810a21c4a5b1913" exitCode=137 Oct 07 17:22:46 crc kubenswrapper[4681]: I1007 17:22:46.755536 4681 generic.go:334] "Generic (PLEG): container finished" podID="4eee8e4e-e688-44f8-aff9-44b5f57e1b68" containerID="e72a348d333ad2ee6f8d1e2f2fa1710ced71845a803b980aca00d8c13999706d" exitCode=137 Oct 07 17:22:46 crc kubenswrapper[4681]: I1007 17:22:46.755594 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bbd487785-qh8xz" event={"ID":"4eee8e4e-e688-44f8-aff9-44b5f57e1b68","Type":"ContainerDied","Data":"0eec6b6b2c006c550724278ee4876af6bcb6e713b749e75e6810a21c4a5b1913"} Oct 07 17:22:46 crc kubenswrapper[4681]: I1007 17:22:46.755632 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bbd487785-qh8xz" event={"ID":"4eee8e4e-e688-44f8-aff9-44b5f57e1b68","Type":"ContainerDied","Data":"e72a348d333ad2ee6f8d1e2f2fa1710ced71845a803b980aca00d8c13999706d"} Oct 07 17:22:46 crc kubenswrapper[4681]: I1007 17:22:46.763812 4681 generic.go:334] "Generic (PLEG): container finished" podID="4c67fd19-2e0a-4afa-a595-fce5c29c3f18" containerID="8b340fbf22bcedbe7dacd4b8cd868085dcbfc8c8c0da3a84cd19f8b5bbfdf7e9" exitCode=137 Oct 07 17:22:46 crc kubenswrapper[4681]: I1007 17:22:46.763844 4681 generic.go:334] "Generic (PLEG): container finished" podID="4c67fd19-2e0a-4afa-a595-fce5c29c3f18" containerID="13b916f178d0bb19a6190b7678f715a9f137a971c01560d907c210fcf71dd502" exitCode=137 Oct 07 17:22:46 crc kubenswrapper[4681]: I1007 17:22:46.763864 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76c6b58665-pvxbw" 
event={"ID":"4c67fd19-2e0a-4afa-a595-fce5c29c3f18","Type":"ContainerDied","Data":"8b340fbf22bcedbe7dacd4b8cd868085dcbfc8c8c0da3a84cd19f8b5bbfdf7e9"} Oct 07 17:22:46 crc kubenswrapper[4681]: I1007 17:22:46.763915 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76c6b58665-pvxbw" event={"ID":"4c67fd19-2e0a-4afa-a595-fce5c29c3f18","Type":"ContainerDied","Data":"13b916f178d0bb19a6190b7678f715a9f137a971c01560d907c210fcf71dd502"} Oct 07 17:22:46 crc kubenswrapper[4681]: E1007 17:22:46.895849 4681 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b6eda5c_8ecf_4ed8_bea7_5609a9b8184f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60162389_9c6d_4611_94f1_16baa5b1adcc.slice/crio-914493f6543cc0ee1c62c1780178ca82c5021ae085f2646e1289c8526f82a2ce\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4eee8e4e_e688_44f8_aff9_44b5f57e1b68.slice/crio-e72a348d333ad2ee6f8d1e2f2fa1710ced71845a803b980aca00d8c13999706d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5f57f38_603f_48d1_9326_8b5183fe99ae.slice/crio-c7ad863ed46cafc8f2e967f164546d55da677e0599f1b018bc6905b29839408b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4eee8e4e_e688_44f8_aff9_44b5f57e1b68.slice/crio-conmon-0eec6b6b2c006c550724278ee4876af6bcb6e713b749e75e6810a21c4a5b1913.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9405f877_b9a6_4d64_92f1_df500e73046f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c67fd19_2e0a_4afa_a595_fce5c29c3f18.slice/crio-8b340fbf22bcedbe7dacd4b8cd868085dcbfc8c8c0da3a84cd19f8b5bbfdf7e9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdb73657_7045_4536_b856_81fcc6da6718.slice/crio-845b91539d63df41d4b814a0d8bde2491b31caef80995ab7973d2ecb39765e58\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b6eda5c_8ecf_4ed8_bea7_5609a9b8184f.slice/crio-fb26b1cddade9bd34af28fb8d3f6355abc77718060f74d2fe71029bcc0f950e9\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode14c0f1b_a6a9_4e80_a975_890aac3dcd0e.slice/crio-conmon-71d5ff623f8b05189ca6d0b08c6b99727edaffa2405dbc5b4965eb029cb64783.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode14c0f1b_a6a9_4e80_a975_890aac3dcd0e.slice/crio-71d5ff623f8b05189ca6d0b08c6b99727edaffa2405dbc5b4965eb029cb64783.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdb73657_7045_4536_b856_81fcc6da6718.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5f57f38_603f_48d1_9326_8b5183fe99ae.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode14c0f1b_a6a9_4e80_a975_890aac3dcd0e.slice/crio-conmon-a03ca3b288d90e815ad9905ddfda88c242050dd2a3f9a5467015708963131c98.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5f57f38_603f_48d1_9326_8b5183fe99ae.slice/crio-conmon-c7ad863ed46cafc8f2e967f164546d55da677e0599f1b018bc6905b29839408b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c67fd19_2e0a_4afa_a595_fce5c29c3f18.slice/crio-13b916f178d0bb19a6190b7678f715a9f137a971c01560d907c210fcf71dd502.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9405f877_b9a6_4d64_92f1_df500e73046f.slice/crio-conmon-cde73141f00eca0b7472798a820892d64bf39990390b19068b59b3b5b62b7e0e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9405f877_b9a6_4d64_92f1_df500e73046f.slice/crio-e40b54390b87280b0f67a24dc80b8d822fc60bd3344103cc599a5c98d7c357d8\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60162389_9c6d_4611_94f1_16baa5b1adcc.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9405f877_b9a6_4d64_92f1_df500e73046f.slice/crio-cde73141f00eca0b7472798a820892d64bf39990390b19068b59b3b5b62b7e0e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c67fd19_2e0a_4afa_a595_fce5c29c3f18.slice/crio-conmon-8b340fbf22bcedbe7dacd4b8cd868085dcbfc8c8c0da3a84cd19f8b5bbfdf7e9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c67fd19_2e0a_4afa_a595_fce5c29c3f18.slice/crio-conmon-13b916f178d0bb19a6190b7678f715a9f137a971c01560d907c210fcf71dd502.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode14c0f1b_a6a9_4e80_a975_890aac3dcd0e.slice/crio-a03ca3b288d90e815ad9905ddfda88c242050dd2a3f9a5467015708963131c98.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4eee8e4e_e688_44f8_aff9_44b5f57e1b68.slice/crio-conmon-e72a348d333ad2ee6f8d1e2f2fa1710ced71845a803b980aca00d8c13999706d.scope\": RecentStats: unable to find data in memory cache]" Oct 07 17:22:47 crc kubenswrapper[4681]: I1007 17:22:47.441029 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-64677bd694-6xgb2" podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.142:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.142:8443: connect: connection refused" Oct 07 17:22:47 crc kubenswrapper[4681]: I1007 17:22:47.441409 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:22:47 crc kubenswrapper[4681]: I1007 17:22:47.442866 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"a4aad55b86a935fdd7570b3d62dd77646b0917b7fe0adb2434009ebb8ecfb75b"} pod="openstack/horizon-64677bd694-6xgb2" containerMessage="Container horizon failed startup probe, will be 
restarted" Oct 07 17:22:47 crc kubenswrapper[4681]: I1007 17:22:47.443015 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-64677bd694-6xgb2" podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon" containerID="cri-o://a4aad55b86a935fdd7570b3d62dd77646b0917b7fe0adb2434009ebb8ecfb75b" gracePeriod=30 Oct 07 17:22:47 crc kubenswrapper[4681]: I1007 17:22:47.617919 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-f945f854d-hm49c" podUID="02a91326-9285-4589-a05b-c0a2c2ed397e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Oct 07 17:22:47 crc kubenswrapper[4681]: I1007 17:22:47.618161 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:22:47 crc kubenswrapper[4681]: I1007 17:22:47.619157 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"9084625f4c93f3307d3d2fa500d4105766d6a26c88fba8323a56f7e6882db8ed"} pod="openstack/horizon-f945f854d-hm49c" containerMessage="Container horizon failed startup probe, will be restarted" Oct 07 17:22:47 crc kubenswrapper[4681]: I1007 17:22:47.619274 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-f945f854d-hm49c" podUID="02a91326-9285-4589-a05b-c0a2c2ed397e" containerName="horizon" containerID="cri-o://9084625f4c93f3307d3d2fa500d4105766d6a26c88fba8323a56f7e6882db8ed" gracePeriod=30 Oct 07 17:22:50 crc kubenswrapper[4681]: I1007 17:22:50.748474 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 17:22:50 crc kubenswrapper[4681]: I1007 17:22:50.749331 4681 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 17:22:51 crc kubenswrapper[4681]: I1007 17:22:51.112104 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 17:23:03 crc kubenswrapper[4681]: E1007 17:23:03.297211 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 07 17:23:03 crc kubenswrapper[4681]: E1007 17:23:03.297903 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qftrr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-x9lv2_openstack(a53e8384-cd97-4cec-ae70-918f86112a99): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 17:23:03 crc kubenswrapper[4681]: E1007 17:23:03.302346 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-x9lv2" podUID="a53e8384-cd97-4cec-ae70-918f86112a99" Oct 07 17:23:03 crc kubenswrapper[4681]: I1007 17:23:03.599185 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7db8ffcf86-wnnfn" Oct 07 17:23:03 crc kubenswrapper[4681]: E1007 17:23:03.926481 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-x9lv2" podUID="a53e8384-cd97-4cec-ae70-918f86112a99" Oct 07 17:23:04 crc kubenswrapper[4681]: I1007 17:23:04.106587 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:23:05 crc kubenswrapper[4681]: I1007 17:23:05.097480 
4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-68578dd4f6-bzx29" Oct 07 17:23:05 crc kubenswrapper[4681]: I1007 17:23:05.324755 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 07 17:23:05 crc kubenswrapper[4681]: E1007 17:23:05.327926 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f57f38-603f-48d1-9326-8b5183fe99ae" containerName="init" Oct 07 17:23:05 crc kubenswrapper[4681]: I1007 17:23:05.327953 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f57f38-603f-48d1-9326-8b5183fe99ae" containerName="init" Oct 07 17:23:05 crc kubenswrapper[4681]: E1007 17:23:05.327964 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f57f38-603f-48d1-9326-8b5183fe99ae" containerName="dnsmasq-dns" Oct 07 17:23:05 crc kubenswrapper[4681]: I1007 17:23:05.327970 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f57f38-603f-48d1-9326-8b5183fe99ae" containerName="dnsmasq-dns" Oct 07 17:23:05 crc kubenswrapper[4681]: I1007 17:23:05.328162 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f57f38-603f-48d1-9326-8b5183fe99ae" containerName="dnsmasq-dns" Oct 07 17:23:05 crc kubenswrapper[4681]: I1007 17:23:05.328746 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 07 17:23:05 crc kubenswrapper[4681]: I1007 17:23:05.331511 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 07 17:23:05 crc kubenswrapper[4681]: I1007 17:23:05.331568 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-2ng7g" Oct 07 17:23:05 crc kubenswrapper[4681]: I1007 17:23:05.331751 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 07 17:23:05 crc kubenswrapper[4681]: I1007 17:23:05.341129 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 07 17:23:05 crc kubenswrapper[4681]: I1007 17:23:05.390236 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a253ef31-4d02-4fbd-8842-cf2fbe41f307-openstack-config\") pod \"openstackclient\" (UID: \"a253ef31-4d02-4fbd-8842-cf2fbe41f307\") " pod="openstack/openstackclient" Oct 07 17:23:05 crc kubenswrapper[4681]: I1007 17:23:05.390341 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a253ef31-4d02-4fbd-8842-cf2fbe41f307-openstack-config-secret\") pod \"openstackclient\" (UID: \"a253ef31-4d02-4fbd-8842-cf2fbe41f307\") " pod="openstack/openstackclient" Oct 07 17:23:05 crc kubenswrapper[4681]: I1007 17:23:05.390396 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gpzj\" (UniqueName: \"kubernetes.io/projected/a253ef31-4d02-4fbd-8842-cf2fbe41f307-kube-api-access-9gpzj\") pod \"openstackclient\" (UID: \"a253ef31-4d02-4fbd-8842-cf2fbe41f307\") " pod="openstack/openstackclient" Oct 07 17:23:05 crc kubenswrapper[4681]: I1007 17:23:05.390496 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a253ef31-4d02-4fbd-8842-cf2fbe41f307-combined-ca-bundle\") pod 
\"openstackclient\" (UID: \"a253ef31-4d02-4fbd-8842-cf2fbe41f307\") " pod="openstack/openstackclient" Oct 07 17:23:05 crc kubenswrapper[4681]: I1007 17:23:05.491050 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a253ef31-4d02-4fbd-8842-cf2fbe41f307-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a253ef31-4d02-4fbd-8842-cf2fbe41f307\") " pod="openstack/openstackclient" Oct 07 17:23:05 crc kubenswrapper[4681]: I1007 17:23:05.491111 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a253ef31-4d02-4fbd-8842-cf2fbe41f307-openstack-config\") pod \"openstackclient\" (UID: \"a253ef31-4d02-4fbd-8842-cf2fbe41f307\") " pod="openstack/openstackclient" Oct 07 17:23:05 crc kubenswrapper[4681]: I1007 17:23:05.491159 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a253ef31-4d02-4fbd-8842-cf2fbe41f307-openstack-config-secret\") pod \"openstackclient\" (UID: \"a253ef31-4d02-4fbd-8842-cf2fbe41f307\") " pod="openstack/openstackclient" Oct 07 17:23:05 crc kubenswrapper[4681]: I1007 17:23:05.491205 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gpzj\" (UniqueName: \"kubernetes.io/projected/a253ef31-4d02-4fbd-8842-cf2fbe41f307-kube-api-access-9gpzj\") pod \"openstackclient\" (UID: \"a253ef31-4d02-4fbd-8842-cf2fbe41f307\") " pod="openstack/openstackclient" Oct 07 17:23:05 crc kubenswrapper[4681]: I1007 17:23:05.492775 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a253ef31-4d02-4fbd-8842-cf2fbe41f307-openstack-config\") pod \"openstackclient\" (UID: \"a253ef31-4d02-4fbd-8842-cf2fbe41f307\") " pod="openstack/openstackclient" Oct 07 17:23:05 crc kubenswrapper[4681]: I1007 17:23:05.498780 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a253ef31-4d02-4fbd-8842-cf2fbe41f307-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a253ef31-4d02-4fbd-8842-cf2fbe41f307\") " pod="openstack/openstackclient" Oct 07 17:23:05 crc kubenswrapper[4681]: I1007 17:23:05.517686 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gpzj\" (UniqueName: \"kubernetes.io/projected/a253ef31-4d02-4fbd-8842-cf2fbe41f307-kube-api-access-9gpzj\") pod \"openstackclient\" (UID: \"a253ef31-4d02-4fbd-8842-cf2fbe41f307\") " pod="openstack/openstackclient" Oct 07 17:23:05 crc kubenswrapper[4681]: I1007 17:23:05.520768 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a253ef31-4d02-4fbd-8842-cf2fbe41f307-openstack-config-secret\") pod \"openstackclient\" (UID: \"a253ef31-4d02-4fbd-8842-cf2fbe41f307\") " pod="openstack/openstackclient" Oct 07 17:23:05 crc kubenswrapper[4681]: I1007 17:23:05.670232 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.526337 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6bbd487785-qh8xz" Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.538007 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-76c6b58665-pvxbw" Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.667148 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzxrt\" (UniqueName: \"kubernetes.io/projected/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-kube-api-access-hzxrt\") pod \"4eee8e4e-e688-44f8-aff9-44b5f57e1b68\" (UID: \"4eee8e4e-e688-44f8-aff9-44b5f57e1b68\") " Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.667320 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-config-data\") pod \"4c67fd19-2e0a-4afa-a595-fce5c29c3f18\" (UID: \"4c67fd19-2e0a-4afa-a595-fce5c29c3f18\") " Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.667350 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-logs\") pod \"4eee8e4e-e688-44f8-aff9-44b5f57e1b68\" (UID: \"4eee8e4e-e688-44f8-aff9-44b5f57e1b68\") " Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.667404 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-logs\") pod \"4c67fd19-2e0a-4afa-a595-fce5c29c3f18\" (UID: \"4c67fd19-2e0a-4afa-a595-fce5c29c3f18\") " Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.667457 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-horizon-secret-key\") pod \"4c67fd19-2e0a-4afa-a595-fce5c29c3f18\" (UID: \"4c67fd19-2e0a-4afa-a595-fce5c29c3f18\") " Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.667486 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkcmv\" (UniqueName: \"kubernetes.io/projected/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-kube-api-access-mkcmv\") pod \"4c67fd19-2e0a-4afa-a595-fce5c29c3f18\" (UID: \"4c67fd19-2e0a-4afa-a595-fce5c29c3f18\") " Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.668231 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-config-data\") pod \"4eee8e4e-e688-44f8-aff9-44b5f57e1b68\" (UID: \"4eee8e4e-e688-44f8-aff9-44b5f57e1b68\") " Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.668261 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-scripts\") pod \"4eee8e4e-e688-44f8-aff9-44b5f57e1b68\" (UID: \"4eee8e4e-e688-44f8-aff9-44b5f57e1b68\") " Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.668092 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-logs" (OuterVolumeSpecName: "logs") pod "4c67fd19-2e0a-4afa-a595-fce5c29c3f18" (UID: "4c67fd19-2e0a-4afa-a595-fce5c29c3f18"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.668175 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-logs" (OuterVolumeSpecName: "logs") pod "4eee8e4e-e688-44f8-aff9-44b5f57e1b68" (UID: "4eee8e4e-e688-44f8-aff9-44b5f57e1b68"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.668759 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-scripts\") pod \"4c67fd19-2e0a-4afa-a595-fce5c29c3f18\" (UID: \"4c67fd19-2e0a-4afa-a595-fce5c29c3f18\") " Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.668789 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-horizon-secret-key\") pod \"4eee8e4e-e688-44f8-aff9-44b5f57e1b68\" (UID: \"4eee8e4e-e688-44f8-aff9-44b5f57e1b68\") " Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.669272 4681 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-logs\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.669289 4681 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-logs\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.677702 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-kube-api-access-mkcmv" (OuterVolumeSpecName: "kube-api-access-mkcmv") pod "4c67fd19-2e0a-4afa-a595-fce5c29c3f18" (UID: "4c67fd19-2e0a-4afa-a595-fce5c29c3f18"). InnerVolumeSpecName "kube-api-access-mkcmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.677871 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "4eee8e4e-e688-44f8-aff9-44b5f57e1b68" (UID: "4eee8e4e-e688-44f8-aff9-44b5f57e1b68"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.678195 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "4c67fd19-2e0a-4afa-a595-fce5c29c3f18" (UID: "4c67fd19-2e0a-4afa-a595-fce5c29c3f18"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.700283 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-kube-api-access-hzxrt" (OuterVolumeSpecName: "kube-api-access-hzxrt") pod "4eee8e4e-e688-44f8-aff9-44b5f57e1b68" (UID: "4eee8e4e-e688-44f8-aff9-44b5f57e1b68"). InnerVolumeSpecName "kube-api-access-hzxrt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.710368 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-scripts" (OuterVolumeSpecName: "scripts") pod "4c67fd19-2e0a-4afa-a595-fce5c29c3f18" (UID: "4c67fd19-2e0a-4afa-a595-fce5c29c3f18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.713228 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-scripts" (OuterVolumeSpecName: "scripts") pod "4eee8e4e-e688-44f8-aff9-44b5f57e1b68" (UID: "4eee8e4e-e688-44f8-aff9-44b5f57e1b68"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.715188 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-config-data" (OuterVolumeSpecName: "config-data") pod "4eee8e4e-e688-44f8-aff9-44b5f57e1b68" (UID: "4eee8e4e-e688-44f8-aff9-44b5f57e1b68"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.753146 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-config-data" (OuterVolumeSpecName: "config-data") pod "4c67fd19-2e0a-4afa-a595-fce5c29c3f18" (UID: "4c67fd19-2e0a-4afa-a595-fce5c29c3f18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:23:09 crc kubenswrapper[4681]: E1007 17:23:09.756865 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/sg-core:latest" Oct 07 17:23:09 crc kubenswrapper[4681]: E1007 17:23:09.757011 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:sg-core,Image:quay.io/openstack-k8s-operators/sg-core:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:sg-core-conf-yaml,ReadOnly:false,MountPath:/etc/sg-core.conf.yaml,SubPath:sg-core.conf.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hcr7v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(4673f09e-2140-4dc5-ac9d-af616ddba08d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" 
logger="UnhandledError" Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.771724 4681 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.772041 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzxrt\" (UniqueName: \"kubernetes.io/projected/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-kube-api-access-hzxrt\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.772059 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.772071 4681 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.772084 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkcmv\" (UniqueName: \"kubernetes.io/projected/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-kube-api-access-mkcmv\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.772097 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.772110 4681 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4eee8e4e-e688-44f8-aff9-44b5f57e1b68-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.772122 4681 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c67fd19-2e0a-4afa-a595-fce5c29c3f18-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.983401 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76c6b58665-pvxbw" event={"ID":"4c67fd19-2e0a-4afa-a595-fce5c29c3f18","Type":"ContainerDied","Data":"4b225f8e8ee9f70e8221d3d20becacf540be2a5fef729af2c3fcbb925c032045"} Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.983454 4681 scope.go:117] "RemoveContainer" containerID="8b340fbf22bcedbe7dacd4b8cd868085dcbfc8c8c0da3a84cd19f8b5bbfdf7e9" Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.983595 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76c6b58665-pvxbw" Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.994854 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bbd487785-qh8xz" event={"ID":"4eee8e4e-e688-44f8-aff9-44b5f57e1b68","Type":"ContainerDied","Data":"8af6da5c7db8e532e40eef92fc9a9e31de3a4bbd57d1b2e32ed7ff0d43e5aa2b"} Oct 07 17:23:09 crc kubenswrapper[4681]: I1007 17:23:09.994995 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6bbd487785-qh8xz" Oct 07 17:23:10 crc kubenswrapper[4681]: I1007 17:23:10.057850 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76c6b58665-pvxbw"] Oct 07 17:23:10 crc kubenswrapper[4681]: I1007 17:23:10.075671 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-76c6b58665-pvxbw"] Oct 07 17:23:10 crc kubenswrapper[4681]: I1007 17:23:10.098489 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6bbd487785-qh8xz"] Oct 07 17:23:10 crc kubenswrapper[4681]: I1007 17:23:10.116421 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6bbd487785-qh8xz"] Oct 07 17:23:10 crc kubenswrapper[4681]: I1007 17:23:10.243009 4681 scope.go:117] "RemoveContainer" containerID="13b916f178d0bb19a6190b7678f715a9f137a971c01560d907c210fcf71dd502" Oct 07 17:23:10 crc kubenswrapper[4681]: I1007 17:23:10.254507 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 07 17:23:10 crc kubenswrapper[4681]: W1007 17:23:10.259123 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda253ef31_4d02_4fbd_8842_cf2fbe41f307.slice/crio-fd8c73a58d370072102972654b1abfa65f3cb07699efb27cbcd2ac0e82aa21fd WatchSource:0}: Error finding container fd8c73a58d370072102972654b1abfa65f3cb07699efb27cbcd2ac0e82aa21fd: Status 404 returned error can't find the container with id fd8c73a58d370072102972654b1abfa65f3cb07699efb27cbcd2ac0e82aa21fd Oct 07 17:23:10 crc kubenswrapper[4681]: I1007 17:23:10.260629 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8459b45747-n55dk" Oct 07 17:23:10 crc kubenswrapper[4681]: I1007 17:23:10.274076 4681 scope.go:117] "RemoveContainer" containerID="0eec6b6b2c006c550724278ee4876af6bcb6e713b749e75e6810a21c4a5b1913" Oct 07 17:23:10 crc kubenswrapper[4681]: I1007 17:23:10.278845 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-logs\") pod \"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e\" (UID: \"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e\") " Oct 07 17:23:10 crc kubenswrapper[4681]: I1007 17:23:10.278999 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj6pg\" (UniqueName: \"kubernetes.io/projected/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-kube-api-access-qj6pg\") pod \"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e\" (UID: \"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e\") " Oct 07 17:23:10 crc kubenswrapper[4681]: I1007 17:23:10.279039 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-horizon-secret-key\") pod \"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e\" (UID: \"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e\") " Oct 07 17:23:10 crc kubenswrapper[4681]: I1007 17:23:10.279089 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-scripts\") pod \"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e\" (UID: \"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e\") " Oct 07 17:23:10 crc kubenswrapper[4681]: I1007 17:23:10.279119 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-config-data\") pod \"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e\" (UID: \"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e\") " Oct 07 17:23:10 crc kubenswrapper[4681]: I1007 17:23:10.279182 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-logs" (OuterVolumeSpecName: "logs") pod "e14c0f1b-a6a9-4e80-a975-890aac3dcd0e" (UID: "e14c0f1b-a6a9-4e80-a975-890aac3dcd0e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:23:10 crc kubenswrapper[4681]: I1007 17:23:10.279434 4681 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-logs\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:10 crc kubenswrapper[4681]: I1007 17:23:10.289002 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e14c0f1b-a6a9-4e80-a975-890aac3dcd0e" (UID: "e14c0f1b-a6a9-4e80-a975-890aac3dcd0e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:23:10 crc kubenswrapper[4681]: I1007 17:23:10.290072 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-kube-api-access-qj6pg" (OuterVolumeSpecName: "kube-api-access-qj6pg") pod "e14c0f1b-a6a9-4e80-a975-890aac3dcd0e" (UID: "e14c0f1b-a6a9-4e80-a975-890aac3dcd0e"). InnerVolumeSpecName "kube-api-access-qj6pg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:23:10 crc kubenswrapper[4681]: I1007 17:23:10.317617 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-config-data" (OuterVolumeSpecName: "config-data") pod "e14c0f1b-a6a9-4e80-a975-890aac3dcd0e" (UID: "e14c0f1b-a6a9-4e80-a975-890aac3dcd0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:23:10 crc kubenswrapper[4681]: I1007 17:23:10.319378 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-scripts" (OuterVolumeSpecName: "scripts") pod "e14c0f1b-a6a9-4e80-a975-890aac3dcd0e" (UID: "e14c0f1b-a6a9-4e80-a975-890aac3dcd0e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:23:10 crc kubenswrapper[4681]: I1007 17:23:10.380838 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj6pg\" (UniqueName: \"kubernetes.io/projected/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-kube-api-access-qj6pg\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:10 crc kubenswrapper[4681]: I1007 17:23:10.381148 4681 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:10 crc kubenswrapper[4681]: I1007 17:23:10.381241 4681 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:10 crc kubenswrapper[4681]: I1007 17:23:10.381326 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:10 crc kubenswrapper[4681]: I1007 17:23:10.479335 4681 scope.go:117] "RemoveContainer" containerID="e72a348d333ad2ee6f8d1e2f2fa1710ced71845a803b980aca00d8c13999706d" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.004623 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8a8db576-98ff-44c4-9c62-89332a95ad61","Type":"ContainerStarted","Data":"4e784eb8313d9dbd5e194c9377e7fee36d0baeebe3a57813fbb922bb57706e00"} Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.009160 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8459b45747-n55dk" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.009159 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8459b45747-n55dk" event={"ID":"e14c0f1b-a6a9-4e80-a975-890aac3dcd0e","Type":"ContainerDied","Data":"c62abcd3f0042ec372b93b90b4f0185d8234e7bf0d016b6be567fd5b1f424ee6"} Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.009316 4681 scope.go:117] "RemoveContainer" containerID="71d5ff623f8b05189ca6d0b08c6b99727edaffa2405dbc5b4965eb029cb64783" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.010427 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a253ef31-4d02-4fbd-8842-cf2fbe41f307","Type":"ContainerStarted","Data":"fd8c73a58d370072102972654b1abfa65f3cb07699efb27cbcd2ac0e82aa21fd"} Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.042870 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=43.042849724 podStartE2EDuration="43.042849724s" podCreationTimestamp="2025-10-07 17:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:23:11.030321135 +0000 UTC m=+1194.677732690" watchObservedRunningTime="2025-10-07 17:23:11.042849724 +0000 UTC m=+1194.690261299" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.045519 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c67fd19-2e0a-4afa-a595-fce5c29c3f18" path="/var/lib/kubelet/pods/4c67fd19-2e0a-4afa-a595-fce5c29c3f18/volumes" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.046447 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4eee8e4e-e688-44f8-aff9-44b5f57e1b68" path="/var/lib/kubelet/pods/4eee8e4e-e688-44f8-aff9-44b5f57e1b68/volumes" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.058992 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8459b45747-n55dk"] Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.066115 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8459b45747-n55dk"] Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.407394 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-58b7954b47-8j9j9"] Oct 07 17:23:11 crc kubenswrapper[4681]: E1007 17:23:11.407735 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c67fd19-2e0a-4afa-a595-fce5c29c3f18" containerName="horizon-log" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.407747 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c67fd19-2e0a-4afa-a595-fce5c29c3f18" containerName="horizon-log" Oct 07 17:23:11 crc kubenswrapper[4681]: E1007 17:23:11.407757 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e14c0f1b-a6a9-4e80-a975-890aac3dcd0e" containerName="horizon-log" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.407762 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="e14c0f1b-a6a9-4e80-a975-890aac3dcd0e" containerName="horizon-log" Oct 07 17:23:11 crc kubenswrapper[4681]: E1007 17:23:11.407777 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eee8e4e-e688-44f8-aff9-44b5f57e1b68" containerName="horizon-log" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.407784 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eee8e4e-e688-44f8-aff9-44b5f57e1b68" containerName="horizon-log" Oct 07 17:23:11 crc kubenswrapper[4681]: E1007 17:23:11.407806 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c67fd19-2e0a-4afa-a595-fce5c29c3f18" containerName="horizon" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.407812 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c67fd19-2e0a-4afa-a595-fce5c29c3f18" containerName="horizon" Oct 07 17:23:11 crc kubenswrapper[4681]: E1007 17:23:11.407822 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e14c0f1b-a6a9-4e80-a975-890aac3dcd0e" containerName="horizon" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.407828 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="e14c0f1b-a6a9-4e80-a975-890aac3dcd0e" containerName="horizon" Oct 07 17:23:11 crc kubenswrapper[4681]: E1007 17:23:11.407839 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eee8e4e-e688-44f8-aff9-44b5f57e1b68" containerName="horizon" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.407844 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eee8e4e-e688-44f8-aff9-44b5f57e1b68" containerName="horizon" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.408047 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c67fd19-2e0a-4afa-a595-fce5c29c3f18" containerName="horizon-log" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.408068 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eee8e4e-e688-44f8-aff9-44b5f57e1b68" containerName="horizon-log" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.408088 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="e14c0f1b-a6a9-4e80-a975-890aac3dcd0e" containerName="horizon" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 
17:23:11.408101 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c67fd19-2e0a-4afa-a595-fce5c29c3f18" containerName="horizon" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.408120 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eee8e4e-e688-44f8-aff9-44b5f57e1b68" containerName="horizon" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.408131 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="e14c0f1b-a6a9-4e80-a975-890aac3dcd0e" containerName="horizon-log" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.409195 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.411837 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.417535 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.428903 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-58b7954b47-8j9j9"] Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.429753 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.498040 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/642b1a07-3c90-40b5-b6cb-af1d8832649b-etc-swift\") pod \"swift-proxy-58b7954b47-8j9j9\" (UID: \"642b1a07-3c90-40b5-b6cb-af1d8832649b\") " pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.498091 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/642b1a07-3c90-40b5-b6cb-af1d8832649b-run-httpd\") pod \"swift-proxy-58b7954b47-8j9j9\" (UID: \"642b1a07-3c90-40b5-b6cb-af1d8832649b\") " pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.498114 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/642b1a07-3c90-40b5-b6cb-af1d8832649b-config-data\") pod \"swift-proxy-58b7954b47-8j9j9\" (UID: \"642b1a07-3c90-40b5-b6cb-af1d8832649b\") " pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.498155 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/642b1a07-3c90-40b5-b6cb-af1d8832649b-internal-tls-certs\") pod \"swift-proxy-58b7954b47-8j9j9\" (UID: \"642b1a07-3c90-40b5-b6cb-af1d8832649b\") " pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.498182 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/642b1a07-3c90-40b5-b6cb-af1d8832649b-combined-ca-bundle\") pod \"swift-proxy-58b7954b47-8j9j9\" (UID: \"642b1a07-3c90-40b5-b6cb-af1d8832649b\") " pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.498200 4681 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/642b1a07-3c90-40b5-b6cb-af1d8832649b-public-tls-certs\") pod \"swift-proxy-58b7954b47-8j9j9\" (UID: \"642b1a07-3c90-40b5-b6cb-af1d8832649b\") " pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.498245 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/642b1a07-3c90-40b5-b6cb-af1d8832649b-log-httpd\") pod \"swift-proxy-58b7954b47-8j9j9\" (UID: \"642b1a07-3c90-40b5-b6cb-af1d8832649b\") " pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.498311 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dkh8\" (UniqueName: \"kubernetes.io/projected/642b1a07-3c90-40b5-b6cb-af1d8832649b-kube-api-access-9dkh8\") pod \"swift-proxy-58b7954b47-8j9j9\" (UID: \"642b1a07-3c90-40b5-b6cb-af1d8832649b\") " pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.599629 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/642b1a07-3c90-40b5-b6cb-af1d8832649b-etc-swift\") pod \"swift-proxy-58b7954b47-8j9j9\" (UID: \"642b1a07-3c90-40b5-b6cb-af1d8832649b\") " pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.599675 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/642b1a07-3c90-40b5-b6cb-af1d8832649b-run-httpd\") pod \"swift-proxy-58b7954b47-8j9j9\" (UID: \"642b1a07-3c90-40b5-b6cb-af1d8832649b\") " pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.599692 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/642b1a07-3c90-40b5-b6cb-af1d8832649b-config-data\") pod \"swift-proxy-58b7954b47-8j9j9\" (UID: \"642b1a07-3c90-40b5-b6cb-af1d8832649b\") " pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.599751 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/642b1a07-3c90-40b5-b6cb-af1d8832649b-internal-tls-certs\") pod \"swift-proxy-58b7954b47-8j9j9\" (UID: \"642b1a07-3c90-40b5-b6cb-af1d8832649b\") " pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.599776 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/642b1a07-3c90-40b5-b6cb-af1d8832649b-combined-ca-bundle\") pod \"swift-proxy-58b7954b47-8j9j9\" (UID: \"642b1a07-3c90-40b5-b6cb-af1d8832649b\") " pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.599798 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/642b1a07-3c90-40b5-b6cb-af1d8832649b-public-tls-certs\") pod \"swift-proxy-58b7954b47-8j9j9\" (UID: \"642b1a07-3c90-40b5-b6cb-af1d8832649b\") " pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.599820 4681 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/642b1a07-3c90-40b5-b6cb-af1d8832649b-log-httpd\") pod \"swift-proxy-58b7954b47-8j9j9\" (UID: \"642b1a07-3c90-40b5-b6cb-af1d8832649b\") " pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.599867 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dkh8\" (UniqueName: \"kubernetes.io/projected/642b1a07-3c90-40b5-b6cb-af1d8832649b-kube-api-access-9dkh8\") pod \"swift-proxy-58b7954b47-8j9j9\" (UID: \"642b1a07-3c90-40b5-b6cb-af1d8832649b\") " pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.601077 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/642b1a07-3c90-40b5-b6cb-af1d8832649b-log-httpd\") pod \"swift-proxy-58b7954b47-8j9j9\" (UID: \"642b1a07-3c90-40b5-b6cb-af1d8832649b\") " pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.601107 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/642b1a07-3c90-40b5-b6cb-af1d8832649b-run-httpd\") pod \"swift-proxy-58b7954b47-8j9j9\" (UID: \"642b1a07-3c90-40b5-b6cb-af1d8832649b\") " pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.610200 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/642b1a07-3c90-40b5-b6cb-af1d8832649b-etc-swift\") pod \"swift-proxy-58b7954b47-8j9j9\" (UID: \"642b1a07-3c90-40b5-b6cb-af1d8832649b\") " pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.610268 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/642b1a07-3c90-40b5-b6cb-af1d8832649b-public-tls-certs\") pod \"swift-proxy-58b7954b47-8j9j9\" (UID: \"642b1a07-3c90-40b5-b6cb-af1d8832649b\") " pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.619935 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/642b1a07-3c90-40b5-b6cb-af1d8832649b-internal-tls-certs\") pod \"swift-proxy-58b7954b47-8j9j9\" (UID: \"642b1a07-3c90-40b5-b6cb-af1d8832649b\") " pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.628714 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dkh8\" (UniqueName: \"kubernetes.io/projected/642b1a07-3c90-40b5-b6cb-af1d8832649b-kube-api-access-9dkh8\") pod \"swift-proxy-58b7954b47-8j9j9\" (UID: \"642b1a07-3c90-40b5-b6cb-af1d8832649b\") " pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.628850 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/642b1a07-3c90-40b5-b6cb-af1d8832649b-config-data\") pod \"swift-proxy-58b7954b47-8j9j9\" (UID: \"642b1a07-3c90-40b5-b6cb-af1d8832649b\") " pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.631117 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/642b1a07-3c90-40b5-b6cb-af1d8832649b-combined-ca-bundle\") pod \"swift-proxy-58b7954b47-8j9j9\" (UID: \"642b1a07-3c90-40b5-b6cb-af1d8832649b\") " pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.637473 4681 scope.go:117] "RemoveContainer" containerID="a03ca3b288d90e815ad9905ddfda88c242050dd2a3f9a5467015708963131c98" Oct 07 17:23:11 crc kubenswrapper[4681]: I1007 17:23:11.727507 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:12 crc kubenswrapper[4681]: I1007 17:23:12.894529 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-58b7954b47-8j9j9"] Oct 07 17:23:12 crc kubenswrapper[4681]: W1007 17:23:12.915078 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod642b1a07_3c90_40b5_b6cb_af1d8832649b.slice/crio-ffc97849118cbba55fa3a44be3bbadbab4427dbdef64915268a5763dda593d87 WatchSource:0}: Error finding container ffc97849118cbba55fa3a44be3bbadbab4427dbdef64915268a5763dda593d87: Status 404 returned error can't find the container with id ffc97849118cbba55fa3a44be3bbadbab4427dbdef64915268a5763dda593d87 Oct 07 17:23:13 crc kubenswrapper[4681]: I1007 17:23:13.038697 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e14c0f1b-a6a9-4e80-a975-890aac3dcd0e" path="/var/lib/kubelet/pods/e14c0f1b-a6a9-4e80-a975-890aac3dcd0e/volumes" Oct 07 17:23:13 crc kubenswrapper[4681]: I1007 17:23:13.050094 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-58b7954b47-8j9j9" event={"ID":"642b1a07-3c90-40b5-b6cb-af1d8832649b","Type":"ContainerStarted","Data":"ffc97849118cbba55fa3a44be3bbadbab4427dbdef64915268a5763dda593d87"} Oct 07 17:23:13 crc kubenswrapper[4681]: I1007 17:23:13.053635 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lvsnj" event={"ID":"98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6","Type":"ContainerStarted","Data":"202467d9e5207e358cd051d20deafde7b062577ece5c24287b2327a5711be6c3"} Oct 07 17:23:13 crc kubenswrapper[4681]: I1007 17:23:13.079837 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-lvsnj" podStartSLOduration=2.263457087 podStartE2EDuration="52.079810157s" podCreationTimestamp="2025-10-07 17:22:21 +0000 UTC" firstStartedPulling="2025-10-07 17:22:22.152776945 +0000 UTC m=+1145.800188500" lastFinishedPulling="2025-10-07 17:23:11.969130015 +0000 UTC m=+1195.616541570" observedRunningTime="2025-10-07 17:23:13.070251942 +0000 UTC m=+1196.717663497" watchObservedRunningTime="2025-10-07 17:23:13.079810157 +0000 UTC m=+1196.727221742" Oct 07 17:23:14 crc kubenswrapper[4681]: I1007 17:23:14.069131 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-58b7954b47-8j9j9" event={"ID":"642b1a07-3c90-40b5-b6cb-af1d8832649b","Type":"ContainerStarted","Data":"9cc69f1d9c7cf5da90fc5ad5b7a152ddbcd1c1ea626af83aab5db8d012104479"} Oct 07 17:23:14 crc kubenswrapper[4681]: I1007 17:23:14.069453 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-58b7954b47-8j9j9" event={"ID":"642b1a07-3c90-40b5-b6cb-af1d8832649b","Type":"ContainerStarted","Data":"f46632cfbca06b8837a560fedde5b395ce2d4b32f595f2ef2faf98aaa81631c9"} Oct 07 17:23:14 crc kubenswrapper[4681]: I1007 17:23:14.069480 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:14 crc kubenswrapper[4681]: I1007 17:23:14.069501 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:14 crc kubenswrapper[4681]: I1007 17:23:14.103401 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-58b7954b47-8j9j9" podStartSLOduration=3.103375649 podStartE2EDuration="3.103375649s" podCreationTimestamp="2025-10-07 17:23:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:23:14.094151463 +0000 UTC m=+1197.741563018" watchObservedRunningTime="2025-10-07 17:23:14.103375649 +0000 UTC m=+1197.750787204" Oct 07 17:23:17 crc kubenswrapper[4681]: E1007 17:23:17.673678 4681 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod990e1913_44d7_414b_a116_6b712547fc81.slice/crio-a4aad55b86a935fdd7570b3d62dd77646b0917b7fe0adb2434009ebb8ecfb75b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02a91326_9285_4589_a05b_c0a2c2ed397e.slice/crio-9084625f4c93f3307d3d2fa500d4105766d6a26c88fba8323a56f7e6882db8ed.scope\": RecentStats: unable to find data in memory cache]" Oct 07 17:23:18 crc kubenswrapper[4681]: I1007 17:23:18.109604 4681 generic.go:334] "Generic (PLEG): container finished" podID="990e1913-44d7-414b-a116-6b712547fc81" containerID="a4aad55b86a935fdd7570b3d62dd77646b0917b7fe0adb2434009ebb8ecfb75b" exitCode=137 Oct 07 17:23:18 crc kubenswrapper[4681]: I1007 17:23:18.109963 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64677bd694-6xgb2" event={"ID":"990e1913-44d7-414b-a116-6b712547fc81","Type":"ContainerDied","Data":"a4aad55b86a935fdd7570b3d62dd77646b0917b7fe0adb2434009ebb8ecfb75b"} Oct 07 17:23:18 crc kubenswrapper[4681]: I1007 17:23:18.112809 4681 generic.go:334] "Generic (PLEG): container finished" podID="02a91326-9285-4589-a05b-c0a2c2ed397e" containerID="9084625f4c93f3307d3d2fa500d4105766d6a26c88fba8323a56f7e6882db8ed" exitCode=137 Oct 07 17:23:18 crc kubenswrapper[4681]: I1007 17:23:18.112830 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f945f854d-hm49c" event={"ID":"02a91326-9285-4589-a05b-c0a2c2ed397e","Type":"ContainerDied","Data":"9084625f4c93f3307d3d2fa500d4105766d6a26c88fba8323a56f7e6882db8ed"} Oct 07 17:23:19 crc kubenswrapper[4681]: I1007 17:23:19.091577 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 17:23:19 crc kubenswrapper[4681]: I1007 17:23:19.091642 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 17:23:19 crc kubenswrapper[4681]: I1007 17:23:19.128179 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 17:23:19 crc kubenswrapper[4681]: I1007 17:23:19.128777 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 17:23:19 crc kubenswrapper[4681]: I1007 17:23:19.135033 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 17:23:19 crc kubenswrapper[4681]: I1007 
17:23:19.957988 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-kvv88"] Oct 07 17:23:19 crc kubenswrapper[4681]: I1007 17:23:19.959408 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-kvv88" Oct 07 17:23:19 crc kubenswrapper[4681]: I1007 17:23:19.973414 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-kvv88"] Oct 07 17:23:20 crc kubenswrapper[4681]: I1007 17:23:20.059120 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-d8nsb"] Oct 07 17:23:20 crc kubenswrapper[4681]: I1007 17:23:20.060783 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-d8nsb" Oct 07 17:23:20 crc kubenswrapper[4681]: I1007 17:23:20.069657 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tmcp\" (UniqueName: \"kubernetes.io/projected/13c6b5bc-aeb1-47bb-995f-cf7d67007900-kube-api-access-8tmcp\") pod \"nova-api-db-create-kvv88\" (UID: \"13c6b5bc-aeb1-47bb-995f-cf7d67007900\") " pod="openstack/nova-api-db-create-kvv88" Oct 07 17:23:20 crc kubenswrapper[4681]: I1007 17:23:20.077226 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-d8nsb"] Oct 07 17:23:20 crc kubenswrapper[4681]: I1007 17:23:20.131997 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 17:23:20 crc kubenswrapper[4681]: I1007 17:23:20.171421 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqkk2\" (UniqueName: \"kubernetes.io/projected/57a8142b-ea3d-4907-8331-885c973462eb-kube-api-access-dqkk2\") pod \"nova-cell0-db-create-d8nsb\" (UID: \"57a8142b-ea3d-4907-8331-885c973462eb\") " pod="openstack/nova-cell0-db-create-d8nsb" Oct 07 17:23:20 crc kubenswrapper[4681]: I1007 17:23:20.171476 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tmcp\" (UniqueName: \"kubernetes.io/projected/13c6b5bc-aeb1-47bb-995f-cf7d67007900-kube-api-access-8tmcp\") pod \"nova-api-db-create-kvv88\" (UID: \"13c6b5bc-aeb1-47bb-995f-cf7d67007900\") " pod="openstack/nova-api-db-create-kvv88" Oct 07 17:23:20 crc kubenswrapper[4681]: I1007 17:23:20.190636 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 17:23:20 crc kubenswrapper[4681]: I1007 17:23:20.219107 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tmcp\" (UniqueName: \"kubernetes.io/projected/13c6b5bc-aeb1-47bb-995f-cf7d67007900-kube-api-access-8tmcp\") pod \"nova-api-db-create-kvv88\" (UID: \"13c6b5bc-aeb1-47bb-995f-cf7d67007900\") " pod="openstack/nova-api-db-create-kvv88" Oct 07 17:23:20 crc kubenswrapper[4681]: I1007 17:23:20.261942 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-nt64g"] Oct 07 17:23:20 crc kubenswrapper[4681]: I1007 17:23:20.273836 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqkk2\" (UniqueName: \"kubernetes.io/projected/57a8142b-ea3d-4907-8331-885c973462eb-kube-api-access-dqkk2\") pod \"nova-cell0-db-create-d8nsb\" (UID: \"57a8142b-ea3d-4907-8331-885c973462eb\") " pod="openstack/nova-cell0-db-create-d8nsb" Oct 07 17:23:20 crc kubenswrapper[4681]: I1007 17:23:20.280167 4681 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nt64g" Oct 07 17:23:20 crc kubenswrapper[4681]: I1007 17:23:20.280337 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-kvv88" Oct 07 17:23:20 crc kubenswrapper[4681]: I1007 17:23:20.280605 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nt64g"] Oct 07 17:23:20 crc kubenswrapper[4681]: I1007 17:23:20.341372 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqkk2\" (UniqueName: \"kubernetes.io/projected/57a8142b-ea3d-4907-8331-885c973462eb-kube-api-access-dqkk2\") pod \"nova-cell0-db-create-d8nsb\" (UID: \"57a8142b-ea3d-4907-8331-885c973462eb\") " pod="openstack/nova-cell0-db-create-d8nsb" Oct 07 17:23:20 crc kubenswrapper[4681]: I1007 17:23:20.375810 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb56k\" (UniqueName: \"kubernetes.io/projected/85befb0e-1557-44bb-b783-f0ea67d38de9-kube-api-access-gb56k\") pod \"nova-cell1-db-create-nt64g\" (UID: \"85befb0e-1557-44bb-b783-f0ea67d38de9\") " pod="openstack/nova-cell1-db-create-nt64g" Oct 07 17:23:20 crc kubenswrapper[4681]: I1007 17:23:20.378551 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-d8nsb" Oct 07 17:23:20 crc kubenswrapper[4681]: I1007 17:23:20.478026 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb56k\" (UniqueName: \"kubernetes.io/projected/85befb0e-1557-44bb-b783-f0ea67d38de9-kube-api-access-gb56k\") pod \"nova-cell1-db-create-nt64g\" (UID: \"85befb0e-1557-44bb-b783-f0ea67d38de9\") " pod="openstack/nova-cell1-db-create-nt64g" Oct 07 17:23:20 crc kubenswrapper[4681]: I1007 17:23:20.496022 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb56k\" (UniqueName: \"kubernetes.io/projected/85befb0e-1557-44bb-b783-f0ea67d38de9-kube-api-access-gb56k\") pod \"nova-cell1-db-create-nt64g\" (UID: \"85befb0e-1557-44bb-b783-f0ea67d38de9\") " pod="openstack/nova-cell1-db-create-nt64g" Oct 07 17:23:20 crc kubenswrapper[4681]: I1007 17:23:20.616806 4681 util.go:30] "No sandbox for pod can be found. 
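Need to start a new one" pod="openstack/nova-cell1-db-create-nt64g"

Each kube-api-access-* volume above (kube-api-access-8tmcp, -dqkk2, -gb56k) is the projected service-account volume kubelet builds for a pod: a bound API token, the cluster CA bundle, and the namespace file, mounted at /var/run/secrets/kubernetes.io/serviceaccount. A sketch of how a process inside one of these db-create pods would read that mount (standard paths; error handling reduced to a message):

    // Reads the projected service-account mount from inside a pod.
    package main

    import (
        "fmt"
        "os"
    )

    func main() {
        const dir = "/var/run/secrets/kubernetes.io/serviceaccount"
        for _, name := range []string{"token", "ca.crt", "namespace"} {
            b, err := os.ReadFile(dir + "/" + name)
            if err != nil {
                fmt.Println("not running inside a pod:", err)
                return
            }
            fmt.Printf("%s: %d bytes\n", name, len(b))
        }
    }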
Oct 07 17:23:21 crc kubenswrapper[4681]: I1007 17:23:21.146767 4681 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 17:23:21 crc kubenswrapper[4681]: I1007 17:23:21.736820 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:21 crc kubenswrapper[4681]: I1007 17:23:21.737663 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-58b7954b47-8j9j9" Oct 07 17:23:22 crc kubenswrapper[4681]: I1007 17:23:22.162141 4681 generic.go:334] "Generic (PLEG): container finished" podID="98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6" containerID="202467d9e5207e358cd051d20deafde7b062577ece5c24287b2327a5711be6c3" exitCode=0 Oct 07 17:23:22 crc kubenswrapper[4681]: I1007 17:23:22.162239 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lvsnj" event={"ID":"98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6","Type":"ContainerDied","Data":"202467d9e5207e358cd051d20deafde7b062577ece5c24287b2327a5711be6c3"} Oct 07 17:23:22 crc kubenswrapper[4681]: I1007 17:23:22.162564 4681 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 17:23:22 crc kubenswrapper[4681]: I1007 17:23:22.162576 4681 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 17:23:22 crc kubenswrapper[4681]: I1007 17:23:22.163136 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8a8db576-98ff-44c4-9c62-89332a95ad61" containerName="glance-log" containerID="cri-o://3fef274458acb4055c335ae5b276d774add2dad4871ff025ea707ec039377925" gracePeriod=30 Oct 07 17:23:22 crc kubenswrapper[4681]: I1007 17:23:22.163213 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8a8db576-98ff-44c4-9c62-89332a95ad61" containerName="glance-httpd" containerID="cri-o://4e784eb8313d9dbd5e194c9377e7fee36d0baeebe3a57813fbb922bb57706e00" gracePeriod=30 Oct 07 17:23:22 crc kubenswrapper[4681]: I1007 17:23:22.174186 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="8a8db576-98ff-44c4-9c62-89332a95ad61" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.154:9292/healthcheck\": EOF" Oct 07 17:23:22 crc kubenswrapper[4681]: I1007 17:23:22.174485 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="8a8db576-98ff-44c4-9c62-89332a95ad61" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.154:9292/healthcheck\": EOF" Oct 07 17:23:23 crc kubenswrapper[4681]: I1007 17:23:23.173929 4681 generic.go:334] "Generic (PLEG): container finished" podID="8a8db576-98ff-44c4-9c62-89332a95ad61" containerID="3fef274458acb4055c335ae5b276d774add2dad4871ff025ea707ec039377925" exitCode=143 Oct 07 17:23:23 crc kubenswrapper[4681]: I1007 17:23:23.174133 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8a8db576-98ff-44c4-9c62-89332a95ad61","Type":"ContainerDied","Data":"3fef274458acb4055c335ae5b276d774add2dad4871ff025ea707ec039377925"} Oct 07 17:23:23 crc kubenswrapper[4681]: I1007 17:23:23.951853 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 17:23:23 crc 
kubenswrapper[4681]: I1007 17:23:23.952179 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e420b4e3-f7b8-4d53-8b39-99ae105c3079" containerName="glance-log" containerID="cri-o://62699061cdcd57b0ac85f0b1816a2bd868995131a837bc40f529bf7450dcec8e" gracePeriod=30 Oct 07 17:23:23 crc kubenswrapper[4681]: I1007 17:23:23.952327 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e420b4e3-f7b8-4d53-8b39-99ae105c3079" containerName="glance-httpd" containerID="cri-o://6dad84ce9fe24dcae7da413e3f27702c59c553e8c354b26b7b553462b5470432" gracePeriod=30 Oct 07 17:23:24 crc kubenswrapper[4681]: I1007 17:23:24.184693 4681 generic.go:334] "Generic (PLEG): container finished" podID="e420b4e3-f7b8-4d53-8b39-99ae105c3079" containerID="62699061cdcd57b0ac85f0b1816a2bd868995131a837bc40f529bf7450dcec8e" exitCode=143 Oct 07 17:23:24 crc kubenswrapper[4681]: I1007 17:23:24.184750 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e420b4e3-f7b8-4d53-8b39-99ae105c3079","Type":"ContainerDied","Data":"62699061cdcd57b0ac85f0b1816a2bd868995131a837bc40f529bf7450dcec8e"} Oct 07 17:23:24 crc kubenswrapper[4681]: I1007 17:23:24.794656 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lvsnj" Oct 07 17:23:24 crc kubenswrapper[4681]: I1007 17:23:24.865434 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6-db-sync-config-data\") pod \"98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6\" (UID: \"98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6\") " Oct 07 17:23:24 crc kubenswrapper[4681]: I1007 17:23:24.865480 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6-combined-ca-bundle\") pod \"98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6\" (UID: \"98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6\") " Oct 07 17:23:24 crc kubenswrapper[4681]: I1007 17:23:24.865519 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jghfx\" (UniqueName: \"kubernetes.io/projected/98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6-kube-api-access-jghfx\") pod \"98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6\" (UID: \"98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6\") " Oct 07 17:23:24 crc kubenswrapper[4681]: I1007 17:23:24.872767 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6-kube-api-access-jghfx" (OuterVolumeSpecName: "kube-api-access-jghfx") pod "98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6" (UID: "98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6"). InnerVolumeSpecName "kube-api-access-jghfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:23:24 crc kubenswrapper[4681]: I1007 17:23:24.874490 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6" (UID: "98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6"). InnerVolumeSpecName "db-sync-config-data". 
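PluginName "kubernetes.io/secret", VolumeGidValue ""

The exit codes in the surrounding PLEG records follow the usual 128+signal convention: glance-log exits 143 (128+15, SIGTERM) because it stopped within its gracePeriod=30 window, while the horizon containers earlier exited 137 (128+9, SIGKILL) after outliving theirs; barbican-db-sync's exitCode=0 is a normal job completion. A small decoder (Unix-only, since it uses syscall.Signal names):

    // Decodes container exit codes per the 128+signal convention.
    package main

    import (
        "fmt"
        "syscall"
    )

    func decode(code int) string {
        if code > 128 {
            sig := syscall.Signal(code - 128)
            return fmt.Sprintf("killed by signal %d (%v)", code-128, sig)
        }
        return fmt.Sprintf("exited with status %d", code)
    }

    func main() {
        fmt.Println(143, "=>", decode(143)) // killed by signal 15 (terminated)
        fmt.Println(137, "=>", decode(137)) // killed by signal 9 (killed)
        fmt.Println(0, "=>", decode(0))     // exited with status 0
    }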
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:23:24 crc kubenswrapper[4681]: I1007 17:23:24.904855 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6" (UID: "98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:23:24 crc kubenswrapper[4681]: I1007 17:23:24.967296 4681 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:24 crc kubenswrapper[4681]: I1007 17:23:24.967341 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:24 crc kubenswrapper[4681]: I1007 17:23:24.967354 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jghfx\" (UniqueName: \"kubernetes.io/projected/98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6-kube-api-access-jghfx\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:25 crc kubenswrapper[4681]: I1007 17:23:25.203324 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lvsnj" event={"ID":"98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6","Type":"ContainerDied","Data":"2e2c30d6b1d112ab75eabc9816ab2c1c0d18f88dae97fdde394b629eee3f10b7"} Oct 07 17:23:25 crc kubenswrapper[4681]: I1007 17:23:25.203361 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e2c30d6b1d112ab75eabc9816ab2c1c0d18f88dae97fdde394b629eee3f10b7" Oct 07 17:23:25 crc kubenswrapper[4681]: I1007 17:23:25.203371 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lvsnj" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.005192 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7d869d8764-5bjtz"] Oct 07 17:23:26 crc kubenswrapper[4681]: E1007 17:23:26.005853 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6" containerName="barbican-db-sync" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.005869 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6" containerName="barbican-db-sync" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.006093 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6" containerName="barbican-db-sync" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.007048 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7d869d8764-5bjtz" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.009448 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-hhdkv" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.010414 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.010641 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.011348 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-57b57fb795-6426k"] Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.015054 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-57b57fb795-6426k" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.016984 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.042264 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-57b57fb795-6426k"] Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.068471 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7d869d8764-5bjtz"] Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.094024 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62d6d4e2-d1d4-4967-82e9-143266e1165b-config-data\") pod \"barbican-worker-57b57fb795-6426k\" (UID: \"62d6d4e2-d1d4-4967-82e9-143266e1165b\") " pod="openstack/barbican-worker-57b57fb795-6426k" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.094069 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f35c1eb1-692d-4484-a686-5ad0ce63744b-config-data\") pod \"barbican-keystone-listener-7d869d8764-5bjtz\" (UID: \"f35c1eb1-692d-4484-a686-5ad0ce63744b\") " pod="openstack/barbican-keystone-listener-7d869d8764-5bjtz" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.094097 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62d6d4e2-d1d4-4967-82e9-143266e1165b-combined-ca-bundle\") pod \"barbican-worker-57b57fb795-6426k\" (UID: \"62d6d4e2-d1d4-4967-82e9-143266e1165b\") " pod="openstack/barbican-worker-57b57fb795-6426k" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.094140 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lndm\" (UniqueName: \"kubernetes.io/projected/62d6d4e2-d1d4-4967-82e9-143266e1165b-kube-api-access-2lndm\") pod \"barbican-worker-57b57fb795-6426k\" (UID: \"62d6d4e2-d1d4-4967-82e9-143266e1165b\") " pod="openstack/barbican-worker-57b57fb795-6426k" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.094170 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f35c1eb1-692d-4484-a686-5ad0ce63744b-config-data-custom\") pod \"barbican-keystone-listener-7d869d8764-5bjtz\" (UID: \"f35c1eb1-692d-4484-a686-5ad0ce63744b\") " 
pod="openstack/barbican-keystone-listener-7d869d8764-5bjtz" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.094186 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f35c1eb1-692d-4484-a686-5ad0ce63744b-logs\") pod \"barbican-keystone-listener-7d869d8764-5bjtz\" (UID: \"f35c1eb1-692d-4484-a686-5ad0ce63744b\") " pod="openstack/barbican-keystone-listener-7d869d8764-5bjtz" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.094200 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f35c1eb1-692d-4484-a686-5ad0ce63744b-combined-ca-bundle\") pod \"barbican-keystone-listener-7d869d8764-5bjtz\" (UID: \"f35c1eb1-692d-4484-a686-5ad0ce63744b\") " pod="openstack/barbican-keystone-listener-7d869d8764-5bjtz" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.094303 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hk45\" (UniqueName: \"kubernetes.io/projected/f35c1eb1-692d-4484-a686-5ad0ce63744b-kube-api-access-4hk45\") pod \"barbican-keystone-listener-7d869d8764-5bjtz\" (UID: \"f35c1eb1-692d-4484-a686-5ad0ce63744b\") " pod="openstack/barbican-keystone-listener-7d869d8764-5bjtz" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.094370 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62d6d4e2-d1d4-4967-82e9-143266e1165b-config-data-custom\") pod \"barbican-worker-57b57fb795-6426k\" (UID: \"62d6d4e2-d1d4-4967-82e9-143266e1165b\") " pod="openstack/barbican-worker-57b57fb795-6426k" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.094422 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62d6d4e2-d1d4-4967-82e9-143266e1165b-logs\") pod \"barbican-worker-57b57fb795-6426k\" (UID: \"62d6d4e2-d1d4-4967-82e9-143266e1165b\") " pod="openstack/barbican-worker-57b57fb795-6426k" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.201182 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62d6d4e2-d1d4-4967-82e9-143266e1165b-config-data-custom\") pod \"barbican-worker-57b57fb795-6426k\" (UID: \"62d6d4e2-d1d4-4967-82e9-143266e1165b\") " pod="openstack/barbican-worker-57b57fb795-6426k" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.201262 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62d6d4e2-d1d4-4967-82e9-143266e1165b-logs\") pod \"barbican-worker-57b57fb795-6426k\" (UID: \"62d6d4e2-d1d4-4967-82e9-143266e1165b\") " pod="openstack/barbican-worker-57b57fb795-6426k" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.201341 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62d6d4e2-d1d4-4967-82e9-143266e1165b-config-data\") pod \"barbican-worker-57b57fb795-6426k\" (UID: \"62d6d4e2-d1d4-4967-82e9-143266e1165b\") " pod="openstack/barbican-worker-57b57fb795-6426k" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.201365 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f35c1eb1-692d-4484-a686-5ad0ce63744b-config-data\") pod \"barbican-keystone-listener-7d869d8764-5bjtz\" (UID: \"f35c1eb1-692d-4484-a686-5ad0ce63744b\") " pod="openstack/barbican-keystone-listener-7d869d8764-5bjtz" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.201394 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62d6d4e2-d1d4-4967-82e9-143266e1165b-combined-ca-bundle\") pod \"barbican-worker-57b57fb795-6426k\" (UID: \"62d6d4e2-d1d4-4967-82e9-143266e1165b\") " pod="openstack/barbican-worker-57b57fb795-6426k" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.201428 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lndm\" (UniqueName: \"kubernetes.io/projected/62d6d4e2-d1d4-4967-82e9-143266e1165b-kube-api-access-2lndm\") pod \"barbican-worker-57b57fb795-6426k\" (UID: \"62d6d4e2-d1d4-4967-82e9-143266e1165b\") " pod="openstack/barbican-worker-57b57fb795-6426k" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.201456 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f35c1eb1-692d-4484-a686-5ad0ce63744b-config-data-custom\") pod \"barbican-keystone-listener-7d869d8764-5bjtz\" (UID: \"f35c1eb1-692d-4484-a686-5ad0ce63744b\") " pod="openstack/barbican-keystone-listener-7d869d8764-5bjtz" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.201472 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f35c1eb1-692d-4484-a686-5ad0ce63744b-logs\") pod \"barbican-keystone-listener-7d869d8764-5bjtz\" (UID: \"f35c1eb1-692d-4484-a686-5ad0ce63744b\") " pod="openstack/barbican-keystone-listener-7d869d8764-5bjtz" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.201487 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f35c1eb1-692d-4484-a686-5ad0ce63744b-combined-ca-bundle\") pod \"barbican-keystone-listener-7d869d8764-5bjtz\" (UID: \"f35c1eb1-692d-4484-a686-5ad0ce63744b\") " pod="openstack/barbican-keystone-listener-7d869d8764-5bjtz" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.201551 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hk45\" (UniqueName: \"kubernetes.io/projected/f35c1eb1-692d-4484-a686-5ad0ce63744b-kube-api-access-4hk45\") pod \"barbican-keystone-listener-7d869d8764-5bjtz\" (UID: \"f35c1eb1-692d-4484-a686-5ad0ce63744b\") " pod="openstack/barbican-keystone-listener-7d869d8764-5bjtz" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.202657 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62d6d4e2-d1d4-4967-82e9-143266e1165b-logs\") pod \"barbican-worker-57b57fb795-6426k\" (UID: \"62d6d4e2-d1d4-4967-82e9-143266e1165b\") " pod="openstack/barbican-worker-57b57fb795-6426k" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.202848 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f35c1eb1-692d-4484-a686-5ad0ce63744b-logs\") pod \"barbican-keystone-listener-7d869d8764-5bjtz\" (UID: \"f35c1eb1-692d-4484-a686-5ad0ce63744b\") " pod="openstack/barbican-keystone-listener-7d869d8764-5bjtz" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.208795 
4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f35c1eb1-692d-4484-a686-5ad0ce63744b-config-data-custom\") pod \"barbican-keystone-listener-7d869d8764-5bjtz\" (UID: \"f35c1eb1-692d-4484-a686-5ad0ce63744b\") " pod="openstack/barbican-keystone-listener-7d869d8764-5bjtz" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.211285 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f35c1eb1-692d-4484-a686-5ad0ce63744b-config-data\") pod \"barbican-keystone-listener-7d869d8764-5bjtz\" (UID: \"f35c1eb1-692d-4484-a686-5ad0ce63744b\") " pod="openstack/barbican-keystone-listener-7d869d8764-5bjtz" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.211766 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62d6d4e2-d1d4-4967-82e9-143266e1165b-config-data-custom\") pod \"barbican-worker-57b57fb795-6426k\" (UID: \"62d6d4e2-d1d4-4967-82e9-143266e1165b\") " pod="openstack/barbican-worker-57b57fb795-6426k" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.212426 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62d6d4e2-d1d4-4967-82e9-143266e1165b-combined-ca-bundle\") pod \"barbican-worker-57b57fb795-6426k\" (UID: \"62d6d4e2-d1d4-4967-82e9-143266e1165b\") " pod="openstack/barbican-worker-57b57fb795-6426k" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.221323 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c67bffd47-g59pm"] Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.223316 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c67bffd47-g59pm" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.242459 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lndm\" (UniqueName: \"kubernetes.io/projected/62d6d4e2-d1d4-4967-82e9-143266e1165b-kube-api-access-2lndm\") pod \"barbican-worker-57b57fb795-6426k\" (UID: \"62d6d4e2-d1d4-4967-82e9-143266e1165b\") " pod="openstack/barbican-worker-57b57fb795-6426k" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.251282 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62d6d4e2-d1d4-4967-82e9-143266e1165b-config-data\") pod \"barbican-worker-57b57fb795-6426k\" (UID: \"62d6d4e2-d1d4-4967-82e9-143266e1165b\") " pod="openstack/barbican-worker-57b57fb795-6426k" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.253522 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f35c1eb1-692d-4484-a686-5ad0ce63744b-combined-ca-bundle\") pod \"barbican-keystone-listener-7d869d8764-5bjtz\" (UID: \"f35c1eb1-692d-4484-a686-5ad0ce63744b\") " pod="openstack/barbican-keystone-listener-7d869d8764-5bjtz" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.264936 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c67bffd47-g59pm"] Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.269521 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hk45\" (UniqueName: \"kubernetes.io/projected/f35c1eb1-692d-4484-a686-5ad0ce63744b-kube-api-access-4hk45\") pod \"barbican-keystone-listener-7d869d8764-5bjtz\" (UID: \"f35c1eb1-692d-4484-a686-5ad0ce63744b\") " pod="openstack/barbican-keystone-listener-7d869d8764-5bjtz" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.306378 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-ovsdbserver-nb\") pod \"dnsmasq-dns-7c67bffd47-g59pm\" (UID: \"4cfda27f-d02b-4885-b681-d84af6856bfe\") " pod="openstack/dnsmasq-dns-7c67bffd47-g59pm" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.306444 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-config\") pod \"dnsmasq-dns-7c67bffd47-g59pm\" (UID: \"4cfda27f-d02b-4885-b681-d84af6856bfe\") " pod="openstack/dnsmasq-dns-7c67bffd47-g59pm" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.306466 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-dns-swift-storage-0\") pod \"dnsmasq-dns-7c67bffd47-g59pm\" (UID: \"4cfda27f-d02b-4885-b681-d84af6856bfe\") " pod="openstack/dnsmasq-dns-7c67bffd47-g59pm" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.306523 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-dns-svc\") pod \"dnsmasq-dns-7c67bffd47-g59pm\" (UID: \"4cfda27f-d02b-4885-b681-d84af6856bfe\") " pod="openstack/dnsmasq-dns-7c67bffd47-g59pm" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.306540 4681 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-ovsdbserver-sb\") pod \"dnsmasq-dns-7c67bffd47-g59pm\" (UID: \"4cfda27f-d02b-4885-b681-d84af6856bfe\") " pod="openstack/dnsmasq-dns-7c67bffd47-g59pm" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.306580 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59qk8\" (UniqueName: \"kubernetes.io/projected/4cfda27f-d02b-4885-b681-d84af6856bfe-kube-api-access-59qk8\") pod \"dnsmasq-dns-7c67bffd47-g59pm\" (UID: \"4cfda27f-d02b-4885-b681-d84af6856bfe\") " pod="openstack/dnsmasq-dns-7c67bffd47-g59pm" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.319616 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-976bbb468-rxpr4"] Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.333203 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-976bbb468-rxpr4" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.333859 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7d869d8764-5bjtz" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.336217 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.342639 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-976bbb468-rxpr4"] Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.359438 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-57b57fb795-6426k" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.416057 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb9nz\" (UniqueName: \"kubernetes.io/projected/c0e22a43-39e2-4154-b998-dcc84cadf262-kube-api-access-xb9nz\") pod \"barbican-api-976bbb468-rxpr4\" (UID: \"c0e22a43-39e2-4154-b998-dcc84cadf262\") " pod="openstack/barbican-api-976bbb468-rxpr4" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.416250 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-config\") pod \"dnsmasq-dns-7c67bffd47-g59pm\" (UID: \"4cfda27f-d02b-4885-b681-d84af6856bfe\") " pod="openstack/dnsmasq-dns-7c67bffd47-g59pm" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.416649 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-dns-swift-storage-0\") pod \"dnsmasq-dns-7c67bffd47-g59pm\" (UID: \"4cfda27f-d02b-4885-b681-d84af6856bfe\") " pod="openstack/dnsmasq-dns-7c67bffd47-g59pm" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.417327 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e22a43-39e2-4154-b998-dcc84cadf262-combined-ca-bundle\") pod \"barbican-api-976bbb468-rxpr4\" (UID: \"c0e22a43-39e2-4154-b998-dcc84cadf262\") " pod="openstack/barbican-api-976bbb468-rxpr4" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.417368 4681 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0e22a43-39e2-4154-b998-dcc84cadf262-logs\") pod \"barbican-api-976bbb468-rxpr4\" (UID: \"c0e22a43-39e2-4154-b998-dcc84cadf262\") " pod="openstack/barbican-api-976bbb468-rxpr4" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.417866 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-config\") pod \"dnsmasq-dns-7c67bffd47-g59pm\" (UID: \"4cfda27f-d02b-4885-b681-d84af6856bfe\") " pod="openstack/dnsmasq-dns-7c67bffd47-g59pm" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.420182 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-dns-svc\") pod \"dnsmasq-dns-7c67bffd47-g59pm\" (UID: \"4cfda27f-d02b-4885-b681-d84af6856bfe\") " pod="openstack/dnsmasq-dns-7c67bffd47-g59pm" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.421160 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-ovsdbserver-sb\") pod \"dnsmasq-dns-7c67bffd47-g59pm\" (UID: \"4cfda27f-d02b-4885-b681-d84af6856bfe\") " pod="openstack/dnsmasq-dns-7c67bffd47-g59pm" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.421255 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59qk8\" (UniqueName: \"kubernetes.io/projected/4cfda27f-d02b-4885-b681-d84af6856bfe-kube-api-access-59qk8\") pod \"dnsmasq-dns-7c67bffd47-g59pm\" (UID: \"4cfda27f-d02b-4885-b681-d84af6856bfe\") " pod="openstack/dnsmasq-dns-7c67bffd47-g59pm" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.421284 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0e22a43-39e2-4154-b998-dcc84cadf262-config-data-custom\") pod \"barbican-api-976bbb468-rxpr4\" (UID: \"c0e22a43-39e2-4154-b998-dcc84cadf262\") " pod="openstack/barbican-api-976bbb468-rxpr4" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.421363 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e22a43-39e2-4154-b998-dcc84cadf262-config-data\") pod \"barbican-api-976bbb468-rxpr4\" (UID: \"c0e22a43-39e2-4154-b998-dcc84cadf262\") " pod="openstack/barbican-api-976bbb468-rxpr4" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.421457 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-ovsdbserver-nb\") pod \"dnsmasq-dns-7c67bffd47-g59pm\" (UID: \"4cfda27f-d02b-4885-b681-d84af6856bfe\") " pod="openstack/dnsmasq-dns-7c67bffd47-g59pm" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.422442 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-ovsdbserver-nb\") pod \"dnsmasq-dns-7c67bffd47-g59pm\" (UID: \"4cfda27f-d02b-4885-b681-d84af6856bfe\") " pod="openstack/dnsmasq-dns-7c67bffd47-g59pm" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.423082 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-ovsdbserver-sb\") pod \"dnsmasq-dns-7c67bffd47-g59pm\" (UID: \"4cfda27f-d02b-4885-b681-d84af6856bfe\") " pod="openstack/dnsmasq-dns-7c67bffd47-g59pm" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.421751 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-dns-swift-storage-0\") pod \"dnsmasq-dns-7c67bffd47-g59pm\" (UID: \"4cfda27f-d02b-4885-b681-d84af6856bfe\") " pod="openstack/dnsmasq-dns-7c67bffd47-g59pm" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.427715 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-dns-svc\") pod \"dnsmasq-dns-7c67bffd47-g59pm\" (UID: \"4cfda27f-d02b-4885-b681-d84af6856bfe\") " pod="openstack/dnsmasq-dns-7c67bffd47-g59pm" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.440293 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59qk8\" (UniqueName: \"kubernetes.io/projected/4cfda27f-d02b-4885-b681-d84af6856bfe-kube-api-access-59qk8\") pod \"dnsmasq-dns-7c67bffd47-g59pm\" (UID: \"4cfda27f-d02b-4885-b681-d84af6856bfe\") " pod="openstack/dnsmasq-dns-7c67bffd47-g59pm" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.522634 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0e22a43-39e2-4154-b998-dcc84cadf262-logs\") pod \"barbican-api-976bbb468-rxpr4\" (UID: \"c0e22a43-39e2-4154-b998-dcc84cadf262\") " pod="openstack/barbican-api-976bbb468-rxpr4" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.522805 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e22a43-39e2-4154-b998-dcc84cadf262-combined-ca-bundle\") pod \"barbican-api-976bbb468-rxpr4\" (UID: \"c0e22a43-39e2-4154-b998-dcc84cadf262\") " pod="openstack/barbican-api-976bbb468-rxpr4" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.522957 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0e22a43-39e2-4154-b998-dcc84cadf262-config-data-custom\") pod \"barbican-api-976bbb468-rxpr4\" (UID: \"c0e22a43-39e2-4154-b998-dcc84cadf262\") " pod="openstack/barbican-api-976bbb468-rxpr4" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.523058 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e22a43-39e2-4154-b998-dcc84cadf262-config-data\") pod \"barbican-api-976bbb468-rxpr4\" (UID: \"c0e22a43-39e2-4154-b998-dcc84cadf262\") " pod="openstack/barbican-api-976bbb468-rxpr4" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.523179 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb9nz\" (UniqueName: \"kubernetes.io/projected/c0e22a43-39e2-4154-b998-dcc84cadf262-kube-api-access-xb9nz\") pod \"barbican-api-976bbb468-rxpr4\" (UID: \"c0e22a43-39e2-4154-b998-dcc84cadf262\") " pod="openstack/barbican-api-976bbb468-rxpr4" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.523088 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c0e22a43-39e2-4154-b998-dcc84cadf262-logs\") pod \"barbican-api-976bbb468-rxpr4\" (UID: \"c0e22a43-39e2-4154-b998-dcc84cadf262\") " pod="openstack/barbican-api-976bbb468-rxpr4" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.526900 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e22a43-39e2-4154-b998-dcc84cadf262-combined-ca-bundle\") pod \"barbican-api-976bbb468-rxpr4\" (UID: \"c0e22a43-39e2-4154-b998-dcc84cadf262\") " pod="openstack/barbican-api-976bbb468-rxpr4" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.527642 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e22a43-39e2-4154-b998-dcc84cadf262-config-data\") pod \"barbican-api-976bbb468-rxpr4\" (UID: \"c0e22a43-39e2-4154-b998-dcc84cadf262\") " pod="openstack/barbican-api-976bbb468-rxpr4" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.527982 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0e22a43-39e2-4154-b998-dcc84cadf262-config-data-custom\") pod \"barbican-api-976bbb468-rxpr4\" (UID: \"c0e22a43-39e2-4154-b998-dcc84cadf262\") " pod="openstack/barbican-api-976bbb468-rxpr4" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.541121 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb9nz\" (UniqueName: \"kubernetes.io/projected/c0e22a43-39e2-4154-b998-dcc84cadf262-kube-api-access-xb9nz\") pod \"barbican-api-976bbb468-rxpr4\" (UID: \"c0e22a43-39e2-4154-b998-dcc84cadf262\") " pod="openstack/barbican-api-976bbb468-rxpr4" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.581419 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="8a8db576-98ff-44c4-9c62-89332a95ad61" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.154:9292/healthcheck\": read tcp 10.217.0.2:54074->10.217.0.154:9292: read: connection reset by peer" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.581577 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="8a8db576-98ff-44c4-9c62-89332a95ad61" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.154:9292/healthcheck\": read tcp 10.217.0.2:54086->10.217.0.154:9292: read: connection reset by peer" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.687251 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c67bffd47-g59pm" Oct 07 17:23:26 crc kubenswrapper[4681]: I1007 17:23:26.695898 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-976bbb468-rxpr4" Oct 07 17:23:27 crc kubenswrapper[4681]: I1007 17:23:27.258898 4681 generic.go:334] "Generic (PLEG): container finished" podID="8a8db576-98ff-44c4-9c62-89332a95ad61" containerID="4e784eb8313d9dbd5e194c9377e7fee36d0baeebe3a57813fbb922bb57706e00" exitCode=0 Oct 07 17:23:27 crc kubenswrapper[4681]: I1007 17:23:27.258960 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8a8db576-98ff-44c4-9c62-89332a95ad61","Type":"ContainerDied","Data":"4e784eb8313d9dbd5e194c9377e7fee36d0baeebe3a57813fbb922bb57706e00"} Oct 07 17:23:27 crc kubenswrapper[4681]: I1007 17:23:27.262517 4681 generic.go:334] "Generic (PLEG): container finished" podID="e420b4e3-f7b8-4d53-8b39-99ae105c3079" containerID="6dad84ce9fe24dcae7da413e3f27702c59c553e8c354b26b7b553462b5470432" exitCode=0 Oct 07 17:23:27 crc kubenswrapper[4681]: I1007 17:23:27.262549 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e420b4e3-f7b8-4d53-8b39-99ae105c3079","Type":"ContainerDied","Data":"6dad84ce9fe24dcae7da413e3f27702c59c553e8c354b26b7b553462b5470432"} Oct 07 17:23:27 crc kubenswrapper[4681]: I1007 17:23:27.742055 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="e420b4e3-f7b8-4d53-8b39-99ae105c3079" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.153:9292/healthcheck\": dial tcp 10.217.0.153:9292: connect: connection refused" Oct 07 17:23:27 crc kubenswrapper[4681]: I1007 17:23:27.742084 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="e420b4e3-f7b8-4d53-8b39-99ae105c3079" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.153:9292/healthcheck\": dial tcp 10.217.0.153:9292: connect: connection refused" Oct 07 17:23:28 crc kubenswrapper[4681]: I1007 17:23:28.862861 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7d8b9fbb46-6wjkq"] Oct 07 17:23:28 crc kubenswrapper[4681]: I1007 17:23:28.865980 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:28 crc kubenswrapper[4681]: I1007 17:23:28.869824 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 07 17:23:28 crc kubenswrapper[4681]: I1007 17:23:28.871918 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 07 17:23:28 crc kubenswrapper[4681]: I1007 17:23:28.886732 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7d8b9fbb46-6wjkq"] Oct 07 17:23:28 crc kubenswrapper[4681]: I1007 17:23:28.980527 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07f40489-1614-45c8-864b-2288473c7c1d-logs\") pod \"barbican-api-7d8b9fbb46-6wjkq\" (UID: \"07f40489-1614-45c8-864b-2288473c7c1d\") " pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:28 crc kubenswrapper[4681]: I1007 17:23:28.980607 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07f40489-1614-45c8-864b-2288473c7c1d-config-data\") pod \"barbican-api-7d8b9fbb46-6wjkq\" (UID: \"07f40489-1614-45c8-864b-2288473c7c1d\") " pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:28 crc kubenswrapper[4681]: I1007 17:23:28.980668 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07f40489-1614-45c8-864b-2288473c7c1d-public-tls-certs\") pod \"barbican-api-7d8b9fbb46-6wjkq\" (UID: \"07f40489-1614-45c8-864b-2288473c7c1d\") " pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:28 crc kubenswrapper[4681]: I1007 17:23:28.980697 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07f40489-1614-45c8-864b-2288473c7c1d-internal-tls-certs\") pod \"barbican-api-7d8b9fbb46-6wjkq\" (UID: \"07f40489-1614-45c8-864b-2288473c7c1d\") " pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:28 crc kubenswrapper[4681]: I1007 17:23:28.980722 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f40489-1614-45c8-864b-2288473c7c1d-combined-ca-bundle\") pod \"barbican-api-7d8b9fbb46-6wjkq\" (UID: \"07f40489-1614-45c8-864b-2288473c7c1d\") " pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:28 crc kubenswrapper[4681]: I1007 17:23:28.980746 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07f40489-1614-45c8-864b-2288473c7c1d-config-data-custom\") pod \"barbican-api-7d8b9fbb46-6wjkq\" (UID: \"07f40489-1614-45c8-864b-2288473c7c1d\") " pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:28 crc kubenswrapper[4681]: I1007 17:23:28.980771 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2htwj\" (UniqueName: \"kubernetes.io/projected/07f40489-1614-45c8-864b-2288473c7c1d-kube-api-access-2htwj\") pod \"barbican-api-7d8b9fbb46-6wjkq\" (UID: \"07f40489-1614-45c8-864b-2288473c7c1d\") " pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:29 crc kubenswrapper[4681]: I1007 17:23:29.082554 4681 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07f40489-1614-45c8-864b-2288473c7c1d-config-data\") pod \"barbican-api-7d8b9fbb46-6wjkq\" (UID: \"07f40489-1614-45c8-864b-2288473c7c1d\") " pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:29 crc kubenswrapper[4681]: I1007 17:23:29.082641 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07f40489-1614-45c8-864b-2288473c7c1d-public-tls-certs\") pod \"barbican-api-7d8b9fbb46-6wjkq\" (UID: \"07f40489-1614-45c8-864b-2288473c7c1d\") " pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:29 crc kubenswrapper[4681]: I1007 17:23:29.082669 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07f40489-1614-45c8-864b-2288473c7c1d-internal-tls-certs\") pod \"barbican-api-7d8b9fbb46-6wjkq\" (UID: \"07f40489-1614-45c8-864b-2288473c7c1d\") " pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:29 crc kubenswrapper[4681]: I1007 17:23:29.082693 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f40489-1614-45c8-864b-2288473c7c1d-combined-ca-bundle\") pod \"barbican-api-7d8b9fbb46-6wjkq\" (UID: \"07f40489-1614-45c8-864b-2288473c7c1d\") " pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:29 crc kubenswrapper[4681]: I1007 17:23:29.082719 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07f40489-1614-45c8-864b-2288473c7c1d-config-data-custom\") pod \"barbican-api-7d8b9fbb46-6wjkq\" (UID: \"07f40489-1614-45c8-864b-2288473c7c1d\") " pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:29 crc kubenswrapper[4681]: I1007 17:23:29.082745 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2htwj\" (UniqueName: \"kubernetes.io/projected/07f40489-1614-45c8-864b-2288473c7c1d-kube-api-access-2htwj\") pod \"barbican-api-7d8b9fbb46-6wjkq\" (UID: \"07f40489-1614-45c8-864b-2288473c7c1d\") " pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:29 crc kubenswrapper[4681]: I1007 17:23:29.082783 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07f40489-1614-45c8-864b-2288473c7c1d-logs\") pod \"barbican-api-7d8b9fbb46-6wjkq\" (UID: \"07f40489-1614-45c8-864b-2288473c7c1d\") " pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:29 crc kubenswrapper[4681]: I1007 17:23:29.083199 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07f40489-1614-45c8-864b-2288473c7c1d-logs\") pod \"barbican-api-7d8b9fbb46-6wjkq\" (UID: \"07f40489-1614-45c8-864b-2288473c7c1d\") " pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:29 crc kubenswrapper[4681]: I1007 17:23:29.088486 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07f40489-1614-45c8-864b-2288473c7c1d-config-data-custom\") pod \"barbican-api-7d8b9fbb46-6wjkq\" (UID: \"07f40489-1614-45c8-864b-2288473c7c1d\") " pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:29 crc kubenswrapper[4681]: I1007 17:23:29.089170 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/07f40489-1614-45c8-864b-2288473c7c1d-combined-ca-bundle\") pod \"barbican-api-7d8b9fbb46-6wjkq\" (UID: \"07f40489-1614-45c8-864b-2288473c7c1d\") " pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:29 crc kubenswrapper[4681]: I1007 17:23:29.091465 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="8a8db576-98ff-44c4-9c62-89332a95ad61" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.154:9292/healthcheck\": dial tcp 10.217.0.154:9292: connect: connection refused" Oct 07 17:23:29 crc kubenswrapper[4681]: I1007 17:23:29.091734 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="8a8db576-98ff-44c4-9c62-89332a95ad61" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.154:9292/healthcheck\": dial tcp 10.217.0.154:9292: connect: connection refused" Oct 07 17:23:29 crc kubenswrapper[4681]: I1007 17:23:29.107586 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07f40489-1614-45c8-864b-2288473c7c1d-public-tls-certs\") pod \"barbican-api-7d8b9fbb46-6wjkq\" (UID: \"07f40489-1614-45c8-864b-2288473c7c1d\") " pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:29 crc kubenswrapper[4681]: I1007 17:23:29.112960 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07f40489-1614-45c8-864b-2288473c7c1d-config-data\") pod \"barbican-api-7d8b9fbb46-6wjkq\" (UID: \"07f40489-1614-45c8-864b-2288473c7c1d\") " pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:29 crc kubenswrapper[4681]: I1007 17:23:29.117096 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2htwj\" (UniqueName: \"kubernetes.io/projected/07f40489-1614-45c8-864b-2288473c7c1d-kube-api-access-2htwj\") pod \"barbican-api-7d8b9fbb46-6wjkq\" (UID: \"07f40489-1614-45c8-864b-2288473c7c1d\") " pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:29 crc kubenswrapper[4681]: I1007 17:23:29.126342 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07f40489-1614-45c8-864b-2288473c7c1d-internal-tls-certs\") pod \"barbican-api-7d8b9fbb46-6wjkq\" (UID: \"07f40489-1614-45c8-864b-2288473c7c1d\") " pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:29 crc kubenswrapper[4681]: I1007 17:23:29.187624 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:30 crc kubenswrapper[4681]: I1007 17:23:30.292647 4681 generic.go:334] "Generic (PLEG): container finished" podID="236dd612-86c8-413b-8ec4-c0f2a55fbf9a" containerID="bd9eeaa1933fc5b81841139e6a9f5b6cde0d6a5f14c328f1c1c7a60d9d0d0f73" exitCode=0 Oct 07 17:23:30 crc kubenswrapper[4681]: I1007 17:23:30.292805 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ccnch" event={"ID":"236dd612-86c8-413b-8ec4-c0f2a55fbf9a","Type":"ContainerDied","Data":"bd9eeaa1933fc5b81841139e6a9f5b6cde0d6a5f14c328f1c1c7a60d9d0d0f73"} Oct 07 17:23:30 crc kubenswrapper[4681]: I1007 17:23:30.968290 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.130788 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e420b4e3-f7b8-4d53-8b39-99ae105c3079-scripts\") pod \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.130899 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vnsd\" (UniqueName: \"kubernetes.io/projected/e420b4e3-f7b8-4d53-8b39-99ae105c3079-kube-api-access-8vnsd\") pod \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.130954 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e420b4e3-f7b8-4d53-8b39-99ae105c3079-config-data\") pod \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.131028 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e420b4e3-f7b8-4d53-8b39-99ae105c3079-httpd-run\") pod \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.131046 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e420b4e3-f7b8-4d53-8b39-99ae105c3079-combined-ca-bundle\") pod \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.131066 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.131106 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e420b4e3-f7b8-4d53-8b39-99ae105c3079-logs\") pod \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.131135 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e420b4e3-f7b8-4d53-8b39-99ae105c3079-internal-tls-certs\") pod \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\" (UID: \"e420b4e3-f7b8-4d53-8b39-99ae105c3079\") " Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.135543 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e420b4e3-f7b8-4d53-8b39-99ae105c3079-logs" (OuterVolumeSpecName: "logs") pod "e420b4e3-f7b8-4d53-8b39-99ae105c3079" (UID: "e420b4e3-f7b8-4d53-8b39-99ae105c3079"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.137374 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e420b4e3-f7b8-4d53-8b39-99ae105c3079-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e420b4e3-f7b8-4d53-8b39-99ae105c3079" (UID: "e420b4e3-f7b8-4d53-8b39-99ae105c3079"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.138421 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e420b4e3-f7b8-4d53-8b39-99ae105c3079-kube-api-access-8vnsd" (OuterVolumeSpecName: "kube-api-access-8vnsd") pod "e420b4e3-f7b8-4d53-8b39-99ae105c3079" (UID: "e420b4e3-f7b8-4d53-8b39-99ae105c3079"). InnerVolumeSpecName "kube-api-access-8vnsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.147041 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e420b4e3-f7b8-4d53-8b39-99ae105c3079-scripts" (OuterVolumeSpecName: "scripts") pod "e420b4e3-f7b8-4d53-8b39-99ae105c3079" (UID: "e420b4e3-f7b8-4d53-8b39-99ae105c3079"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.156929 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "e420b4e3-f7b8-4d53-8b39-99ae105c3079" (UID: "e420b4e3-f7b8-4d53-8b39-99ae105c3079"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.180045 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e420b4e3-f7b8-4d53-8b39-99ae105c3079-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e420b4e3-f7b8-4d53-8b39-99ae105c3079" (UID: "e420b4e3-f7b8-4d53-8b39-99ae105c3079"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.193524 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e420b4e3-f7b8-4d53-8b39-99ae105c3079-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e420b4e3-f7b8-4d53-8b39-99ae105c3079" (UID: "e420b4e3-f7b8-4d53-8b39-99ae105c3079"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.209388 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e420b4e3-f7b8-4d53-8b39-99ae105c3079-config-data" (OuterVolumeSpecName: "config-data") pod "e420b4e3-f7b8-4d53-8b39-99ae105c3079" (UID: "e420b4e3-f7b8-4d53-8b39-99ae105c3079"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.233270 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e420b4e3-f7b8-4d53-8b39-99ae105c3079-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.233308 4681 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e420b4e3-f7b8-4d53-8b39-99ae105c3079-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.233319 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e420b4e3-f7b8-4d53-8b39-99ae105c3079-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.233353 4681 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.233363 4681 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e420b4e3-f7b8-4d53-8b39-99ae105c3079-logs\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.233372 4681 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e420b4e3-f7b8-4d53-8b39-99ae105c3079-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.233380 4681 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e420b4e3-f7b8-4d53-8b39-99ae105c3079-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.233390 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vnsd\" (UniqueName: \"kubernetes.io/projected/e420b4e3-f7b8-4d53-8b39-99ae105c3079-kube-api-access-8vnsd\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.264556 4681 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.311391 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.313201 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e420b4e3-f7b8-4d53-8b39-99ae105c3079","Type":"ContainerDied","Data":"bba5289ac4634854a786a43e67abd77dde6e0795513de9ed80e980c83bfc9a59"} Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.313272 4681 scope.go:117] "RemoveContainer" containerID="6dad84ce9fe24dcae7da413e3f27702c59c553e8c354b26b7b553462b5470432" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.335422 4681 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.367618 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.377624 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.396265 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 17:23:31 crc kubenswrapper[4681]: E1007 17:23:31.396638 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e420b4e3-f7b8-4d53-8b39-99ae105c3079" containerName="glance-log" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.396654 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="e420b4e3-f7b8-4d53-8b39-99ae105c3079" containerName="glance-log" Oct 07 17:23:31 crc kubenswrapper[4681]: E1007 17:23:31.396678 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e420b4e3-f7b8-4d53-8b39-99ae105c3079" containerName="glance-httpd" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.396685 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="e420b4e3-f7b8-4d53-8b39-99ae105c3079" containerName="glance-httpd" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.396848 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="e420b4e3-f7b8-4d53-8b39-99ae105c3079" containerName="glance-httpd" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.396869 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="e420b4e3-f7b8-4d53-8b39-99ae105c3079" containerName="glance-log" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.397727 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.401782 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.401813 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.477022 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.547162 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1469d2bd-93c0-414a-951e-175bc73f377e-logs\") pod \"glance-default-internal-api-0\" (UID: \"1469d2bd-93c0-414a-951e-175bc73f377e\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.547207 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1469d2bd-93c0-414a-951e-175bc73f377e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1469d2bd-93c0-414a-951e-175bc73f377e\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.547243 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1469d2bd-93c0-414a-951e-175bc73f377e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1469d2bd-93c0-414a-951e-175bc73f377e\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.547271 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"1469d2bd-93c0-414a-951e-175bc73f377e\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.547291 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1469d2bd-93c0-414a-951e-175bc73f377e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1469d2bd-93c0-414a-951e-175bc73f377e\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.547327 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1469d2bd-93c0-414a-951e-175bc73f377e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1469d2bd-93c0-414a-951e-175bc73f377e\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.547366 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4dzd\" (UniqueName: \"kubernetes.io/projected/1469d2bd-93c0-414a-951e-175bc73f377e-kube-api-access-l4dzd\") pod \"glance-default-internal-api-0\" (UID: \"1469d2bd-93c0-414a-951e-175bc73f377e\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.547389 4681 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1469d2bd-93c0-414a-951e-175bc73f377e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1469d2bd-93c0-414a-951e-175bc73f377e\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.650025 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1469d2bd-93c0-414a-951e-175bc73f377e-logs\") pod \"glance-default-internal-api-0\" (UID: \"1469d2bd-93c0-414a-951e-175bc73f377e\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.650087 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1469d2bd-93c0-414a-951e-175bc73f377e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1469d2bd-93c0-414a-951e-175bc73f377e\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.650151 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1469d2bd-93c0-414a-951e-175bc73f377e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1469d2bd-93c0-414a-951e-175bc73f377e\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.650189 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"1469d2bd-93c0-414a-951e-175bc73f377e\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.650220 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1469d2bd-93c0-414a-951e-175bc73f377e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1469d2bd-93c0-414a-951e-175bc73f377e\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.650256 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1469d2bd-93c0-414a-951e-175bc73f377e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1469d2bd-93c0-414a-951e-175bc73f377e\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.650305 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4dzd\" (UniqueName: \"kubernetes.io/projected/1469d2bd-93c0-414a-951e-175bc73f377e-kube-api-access-l4dzd\") pod \"glance-default-internal-api-0\" (UID: \"1469d2bd-93c0-414a-951e-175bc73f377e\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.650329 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1469d2bd-93c0-414a-951e-175bc73f377e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1469d2bd-93c0-414a-951e-175bc73f377e\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.650532 4681 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"1469d2bd-93c0-414a-951e-175bc73f377e\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.651394 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1469d2bd-93c0-414a-951e-175bc73f377e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1469d2bd-93c0-414a-951e-175bc73f377e\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.659807 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1469d2bd-93c0-414a-951e-175bc73f377e-logs\") pod \"glance-default-internal-api-0\" (UID: \"1469d2bd-93c0-414a-951e-175bc73f377e\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.660351 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1469d2bd-93c0-414a-951e-175bc73f377e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1469d2bd-93c0-414a-951e-175bc73f377e\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.663864 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1469d2bd-93c0-414a-951e-175bc73f377e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1469d2bd-93c0-414a-951e-175bc73f377e\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.665426 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1469d2bd-93c0-414a-951e-175bc73f377e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1469d2bd-93c0-414a-951e-175bc73f377e\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.671078 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1469d2bd-93c0-414a-951e-175bc73f377e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1469d2bd-93c0-414a-951e-175bc73f377e\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.689805 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4dzd\" (UniqueName: \"kubernetes.io/projected/1469d2bd-93c0-414a-951e-175bc73f377e-kube-api-access-l4dzd\") pod \"glance-default-internal-api-0\" (UID: \"1469d2bd-93c0-414a-951e-175bc73f377e\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.697208 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"1469d2bd-93c0-414a-951e-175bc73f377e\") " pod="openstack/glance-default-internal-api-0" Oct 07 17:23:31 crc kubenswrapper[4681]: I1007 17:23:31.745305 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 17:23:32 crc kubenswrapper[4681]: E1007 17:23:32.087206 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Oct 07 17:23:32 crc kubenswrapper[4681]: E1007 17:23:32.087411 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hcr7v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(4673f09e-2140-4dc5-ac9d-af616ddba08d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 17:23:32 crc kubenswrapper[4681]: E1007 17:23:32.088657 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"sg-core\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"proxy-httpd\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="4673f09e-2140-4dc5-ac9d-af616ddba08d" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.393986 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.395465 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8a8db576-98ff-44c4-9c62-89332a95ad61","Type":"ContainerDied","Data":"d851e1412b7f356617810ff6ab0e97cc582e7cb3ff99afe06d288fbfeb8bceee"} Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.399911 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4673f09e-2140-4dc5-ac9d-af616ddba08d" containerName="ceilometer-central-agent" containerID="cri-o://47933adc22c72093df0bb815109463ebebb9ddee791709ddfd8670e815a64f85" gracePeriod=30 Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.400109 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ccnch" event={"ID":"236dd612-86c8-413b-8ec4-c0f2a55fbf9a","Type":"ContainerDied","Data":"ad9d6ed8f40df2bea9b382369af9f8819e7608e44aae80308a5a7dbea2304339"} Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.400128 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad9d6ed8f40df2bea9b382369af9f8819e7608e44aae80308a5a7dbea2304339" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.400166 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4673f09e-2140-4dc5-ac9d-af616ddba08d" containerName="ceilometer-notification-agent" containerID="cri-o://b2b40807d3dc738f5256b38322f67bc4941ad7e844bfc7a037b6434a95bfd39c" gracePeriod=30 Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.420599 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-ccnch" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.466831 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a8db576-98ff-44c4-9c62-89332a95ad61-combined-ca-bundle\") pod \"8a8db576-98ff-44c4-9c62-89332a95ad61\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.466895 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a8db576-98ff-44c4-9c62-89332a95ad61-public-tls-certs\") pod \"8a8db576-98ff-44c4-9c62-89332a95ad61\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.466915 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a8db576-98ff-44c4-9c62-89332a95ad61-scripts\") pod \"8a8db576-98ff-44c4-9c62-89332a95ad61\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.466960 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a8db576-98ff-44c4-9c62-89332a95ad61-config-data\") pod \"8a8db576-98ff-44c4-9c62-89332a95ad61\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.467044 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236dd612-86c8-413b-8ec4-c0f2a55fbf9a-combined-ca-bundle\") pod \"236dd612-86c8-413b-8ec4-c0f2a55fbf9a\" (UID: \"236dd612-86c8-413b-8ec4-c0f2a55fbf9a\") " Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.467107 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/236dd612-86c8-413b-8ec4-c0f2a55fbf9a-config\") pod \"236dd612-86c8-413b-8ec4-c0f2a55fbf9a\" (UID: \"236dd612-86c8-413b-8ec4-c0f2a55fbf9a\") " Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.467149 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxlp8\" (UniqueName: \"kubernetes.io/projected/8a8db576-98ff-44c4-9c62-89332a95ad61-kube-api-access-hxlp8\") pod \"8a8db576-98ff-44c4-9c62-89332a95ad61\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.467186 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a8db576-98ff-44c4-9c62-89332a95ad61-httpd-run\") pod \"8a8db576-98ff-44c4-9c62-89332a95ad61\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.467216 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"8a8db576-98ff-44c4-9c62-89332a95ad61\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.467237 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nh6q\" (UniqueName: \"kubernetes.io/projected/236dd612-86c8-413b-8ec4-c0f2a55fbf9a-kube-api-access-6nh6q\") pod \"236dd612-86c8-413b-8ec4-c0f2a55fbf9a\" (UID: \"236dd612-86c8-413b-8ec4-c0f2a55fbf9a\") " Oct 
07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.467260 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a8db576-98ff-44c4-9c62-89332a95ad61-logs\") pod \"8a8db576-98ff-44c4-9c62-89332a95ad61\" (UID: \"8a8db576-98ff-44c4-9c62-89332a95ad61\") " Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.468389 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a8db576-98ff-44c4-9c62-89332a95ad61-logs" (OuterVolumeSpecName: "logs") pod "8a8db576-98ff-44c4-9c62-89332a95ad61" (UID: "8a8db576-98ff-44c4-9c62-89332a95ad61"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.470563 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a8db576-98ff-44c4-9c62-89332a95ad61-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8a8db576-98ff-44c4-9c62-89332a95ad61" (UID: "8a8db576-98ff-44c4-9c62-89332a95ad61"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.502940 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/236dd612-86c8-413b-8ec4-c0f2a55fbf9a-kube-api-access-6nh6q" (OuterVolumeSpecName: "kube-api-access-6nh6q") pod "236dd612-86c8-413b-8ec4-c0f2a55fbf9a" (UID: "236dd612-86c8-413b-8ec4-c0f2a55fbf9a"). InnerVolumeSpecName "kube-api-access-6nh6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.511640 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "8a8db576-98ff-44c4-9c62-89332a95ad61" (UID: "8a8db576-98ff-44c4-9c62-89332a95ad61"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.519865 4681 scope.go:117] "RemoveContainer" containerID="62699061cdcd57b0ac85f0b1816a2bd868995131a837bc40f529bf7450dcec8e" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.520136 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a8db576-98ff-44c4-9c62-89332a95ad61-kube-api-access-hxlp8" (OuterVolumeSpecName: "kube-api-access-hxlp8") pod "8a8db576-98ff-44c4-9c62-89332a95ad61" (UID: "8a8db576-98ff-44c4-9c62-89332a95ad61"). InnerVolumeSpecName "kube-api-access-hxlp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.520229 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a8db576-98ff-44c4-9c62-89332a95ad61-scripts" (OuterVolumeSpecName: "scripts") pod "8a8db576-98ff-44c4-9c62-89332a95ad61" (UID: "8a8db576-98ff-44c4-9c62-89332a95ad61"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.573577 4681 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.573892 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nh6q\" (UniqueName: \"kubernetes.io/projected/236dd612-86c8-413b-8ec4-c0f2a55fbf9a-kube-api-access-6nh6q\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.574003 4681 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a8db576-98ff-44c4-9c62-89332a95ad61-logs\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.574189 4681 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a8db576-98ff-44c4-9c62-89332a95ad61-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.574283 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxlp8\" (UniqueName: \"kubernetes.io/projected/8a8db576-98ff-44c4-9c62-89332a95ad61-kube-api-access-hxlp8\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.574365 4681 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a8db576-98ff-44c4-9c62-89332a95ad61-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.587298 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/236dd612-86c8-413b-8ec4-c0f2a55fbf9a-config" (OuterVolumeSpecName: "config") pod "236dd612-86c8-413b-8ec4-c0f2a55fbf9a" (UID: "236dd612-86c8-413b-8ec4-c0f2a55fbf9a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.589288 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a8db576-98ff-44c4-9c62-89332a95ad61-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a8db576-98ff-44c4-9c62-89332a95ad61" (UID: "8a8db576-98ff-44c4-9c62-89332a95ad61"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.596836 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/236dd612-86c8-413b-8ec4-c0f2a55fbf9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "236dd612-86c8-413b-8ec4-c0f2a55fbf9a" (UID: "236dd612-86c8-413b-8ec4-c0f2a55fbf9a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.707327 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236dd612-86c8-413b-8ec4-c0f2a55fbf9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.707685 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/236dd612-86c8-413b-8ec4-c0f2a55fbf9a-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.707698 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a8db576-98ff-44c4-9c62-89332a95ad61-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.743119 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a8db576-98ff-44c4-9c62-89332a95ad61-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8a8db576-98ff-44c4-9c62-89332a95ad61" (UID: "8a8db576-98ff-44c4-9c62-89332a95ad61"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.780535 4681 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.794038 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-976bbb468-rxpr4"] Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.809834 4681 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.809950 4681 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a8db576-98ff-44c4-9c62-89332a95ad61-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.810920 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a8db576-98ff-44c4-9c62-89332a95ad61-config-data" (OuterVolumeSpecName: "config-data") pod "8a8db576-98ff-44c4-9c62-89332a95ad61" (UID: "8a8db576-98ff-44c4-9c62-89332a95ad61"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.883291 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-kvv88"] Oct 07 17:23:32 crc kubenswrapper[4681]: I1007 17:23:32.911936 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a8db576-98ff-44c4-9c62-89332a95ad61-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.018667 4681 scope.go:117] "RemoveContainer" containerID="4e784eb8313d9dbd5e194c9377e7fee36d0baeebe3a57813fbb922bb57706e00" Oct 07 17:23:33 crc kubenswrapper[4681]: W1007 17:23:33.057287 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13c6b5bc_aeb1_47bb_995f_cf7d67007900.slice/crio-aae5c9b522a3af327d3a6f004afcaa76f686cac56c3eae1ba95df121c5f6fbbd WatchSource:0}: Error finding container aae5c9b522a3af327d3a6f004afcaa76f686cac56c3eae1ba95df121c5f6fbbd: Status 404 returned error can't find the container with id aae5c9b522a3af327d3a6f004afcaa76f686cac56c3eae1ba95df121c5f6fbbd Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.102239 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e420b4e3-f7b8-4d53-8b39-99ae105c3079" path="/var/lib/kubelet/pods/e420b4e3-f7b8-4d53-8b39-99ae105c3079/volumes" Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.107991 4681 scope.go:117] "RemoveContainer" containerID="3fef274458acb4055c335ae5b276d774add2dad4871ff025ea707ec039377925" Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.253504 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nt64g"] Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.301189 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-d8nsb"] Oct 07 17:23:33 crc kubenswrapper[4681]: W1007 17:23:33.408007 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57a8142b_ea3d_4907_8331_885c973462eb.slice/crio-9e1453d294f2fac10db287df17c53e809357c08a15bb80efc20a4edc4c3d763e WatchSource:0}: Error finding container 9e1453d294f2fac10db287df17c53e809357c08a15bb80efc20a4edc4c3d763e: Status 404 returned error can't find the container with id 9e1453d294f2fac10db287df17c53e809357c08a15bb80efc20a4edc4c3d763e Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.437414 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nt64g" event={"ID":"85befb0e-1557-44bb-b783-f0ea67d38de9","Type":"ContainerStarted","Data":"ac7a7963f39c5b49f9030081e35e59ff5b1025617fcb0bd9781436c929a3ef1f"} Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.471282 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f945f854d-hm49c" event={"ID":"02a91326-9285-4589-a05b-c0a2c2ed397e","Type":"ContainerStarted","Data":"a2ef2c60f997a9c728de5a3cb38dc728740b1786f9bd5808e689dbe3f49f3013"} Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.523710 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64677bd694-6xgb2" event={"ID":"990e1913-44d7-414b-a116-6b712547fc81","Type":"ContainerStarted","Data":"af63601f836949946b81ec10e42eb0edfd94800d61baa6f37919799bbd67f8db"} Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.557599 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstackclient" event={"ID":"a253ef31-4d02-4fbd-8842-cf2fbe41f307","Type":"ContainerStarted","Data":"60601d142e3db665f2d87e903dcc3e805a6fbf1c6569b9b13ff5cfadd4484dc6"} Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.568219 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-976bbb468-rxpr4" event={"ID":"c0e22a43-39e2-4154-b998-dcc84cadf262","Type":"ContainerStarted","Data":"14c1d6af19122277cbf0f32e17e6963bae818441771d97fcb9f3c9e41b9749b3"} Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.575589 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.586715 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kvv88" event={"ID":"13c6b5bc-aeb1-47bb-995f-cf7d67007900","Type":"ContainerStarted","Data":"aae5c9b522a3af327d3a6f004afcaa76f686cac56c3eae1ba95df121c5f6fbbd"} Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.596379 4681 generic.go:334] "Generic (PLEG): container finished" podID="4673f09e-2140-4dc5-ac9d-af616ddba08d" containerID="47933adc22c72093df0bb815109463ebebb9ddee791709ddfd8670e815a64f85" exitCode=0 Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.596476 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-ccnch" Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.596996 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4673f09e-2140-4dc5-ac9d-af616ddba08d","Type":"ContainerDied","Data":"47933adc22c72093df0bb815109463ebebb9ddee791709ddfd8670e815a64f85"} Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.602147 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c67bffd47-g59pm"] Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.613231 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-57b57fb795-6426k"] Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.650241 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7d8b9fbb46-6wjkq"] Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.700392 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=8.120895048 podStartE2EDuration="28.700366066s" podCreationTimestamp="2025-10-07 17:23:05 +0000 UTC" firstStartedPulling="2025-10-07 17:23:10.274381135 +0000 UTC m=+1193.921792700" lastFinishedPulling="2025-10-07 17:23:30.853852163 +0000 UTC m=+1214.501263718" observedRunningTime="2025-10-07 17:23:33.580840477 +0000 UTC m=+1217.228252032" watchObservedRunningTime="2025-10-07 17:23:33.700366066 +0000 UTC m=+1217.347777611" Oct 07 17:23:33 crc kubenswrapper[4681]: W1007 17:23:33.744756 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07f40489_1614_45c8_864b_2288473c7c1d.slice/crio-6b0485c8ee0966c1067b4551a5af37bf0759899efeb08f97da2dc487ea5fc722 WatchSource:0}: Error finding container 6b0485c8ee0966c1067b4551a5af37bf0759899efeb08f97da2dc487ea5fc722: Status 404 returned error can't find the container with id 6b0485c8ee0966c1067b4551a5af37bf0759899efeb08f97da2dc487ea5fc722 Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.874072 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] 
Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.922563 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.954753 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 17:23:33 crc kubenswrapper[4681]: E1007 17:23:33.955179 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a8db576-98ff-44c4-9c62-89332a95ad61" containerName="glance-log" Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.955203 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a8db576-98ff-44c4-9c62-89332a95ad61" containerName="glance-log" Oct 07 17:23:33 crc kubenswrapper[4681]: E1007 17:23:33.955229 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a8db576-98ff-44c4-9c62-89332a95ad61" containerName="glance-httpd" Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.955237 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a8db576-98ff-44c4-9c62-89332a95ad61" containerName="glance-httpd" Oct 07 17:23:33 crc kubenswrapper[4681]: E1007 17:23:33.955262 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="236dd612-86c8-413b-8ec4-c0f2a55fbf9a" containerName="neutron-db-sync" Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.955274 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="236dd612-86c8-413b-8ec4-c0f2a55fbf9a" containerName="neutron-db-sync" Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.955512 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="236dd612-86c8-413b-8ec4-c0f2a55fbf9a" containerName="neutron-db-sync" Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.955549 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a8db576-98ff-44c4-9c62-89332a95ad61" containerName="glance-httpd" Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.955567 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a8db576-98ff-44c4-9c62-89332a95ad61" containerName="glance-log" Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.956567 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.961673 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.962234 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.971662 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 17:23:33 crc kubenswrapper[4681]: I1007 17:23:33.993957 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c67bffd47-g59pm"] Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.009184 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-sczsk"] Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.010848 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.027097 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7d869d8764-5bjtz"] Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.050941 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-sczsk"] Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.091010 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.112938 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"dbe731b8-1f1d-449c-accb-3cb97696d1ae\") " pod="openstack/glance-default-external-api-0" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.112990 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9jz9\" (UniqueName: \"kubernetes.io/projected/dbe731b8-1f1d-449c-accb-3cb97696d1ae-kube-api-access-l9jz9\") pod \"glance-default-external-api-0\" (UID: \"dbe731b8-1f1d-449c-accb-3cb97696d1ae\") " pod="openstack/glance-default-external-api-0" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.113023 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffn7x\" (UniqueName: \"kubernetes.io/projected/025296af-e542-46ae-a44e-9288982278e5-kube-api-access-ffn7x\") pod \"dnsmasq-dns-848cf88cfc-sczsk\" (UID: \"025296af-e542-46ae-a44e-9288982278e5\") " pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.113041 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbe731b8-1f1d-449c-accb-3cb97696d1ae-logs\") pod \"glance-default-external-api-0\" (UID: \"dbe731b8-1f1d-449c-accb-3cb97696d1ae\") " pod="openstack/glance-default-external-api-0" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.113076 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbe731b8-1f1d-449c-accb-3cb97696d1ae-config-data\") pod \"glance-default-external-api-0\" (UID: \"dbe731b8-1f1d-449c-accb-3cb97696d1ae\") " pod="openstack/glance-default-external-api-0" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.113106 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbe731b8-1f1d-449c-accb-3cb97696d1ae-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dbe731b8-1f1d-449c-accb-3cb97696d1ae\") " pod="openstack/glance-default-external-api-0" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.113124 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dbe731b8-1f1d-449c-accb-3cb97696d1ae-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dbe731b8-1f1d-449c-accb-3cb97696d1ae\") " pod="openstack/glance-default-external-api-0" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.113145 4681 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-sczsk\" (UID: \"025296af-e542-46ae-a44e-9288982278e5\") " pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.113164 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbe731b8-1f1d-449c-accb-3cb97696d1ae-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dbe731b8-1f1d-449c-accb-3cb97696d1ae\") " pod="openstack/glance-default-external-api-0" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.113186 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-sczsk\" (UID: \"025296af-e542-46ae-a44e-9288982278e5\") " pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.113205 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-sczsk\" (UID: \"025296af-e542-46ae-a44e-9288982278e5\") " pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.113221 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbe731b8-1f1d-449c-accb-3cb97696d1ae-scripts\") pod \"glance-default-external-api-0\" (UID: \"dbe731b8-1f1d-449c-accb-3cb97696d1ae\") " pod="openstack/glance-default-external-api-0" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.113256 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-sczsk\" (UID: \"025296af-e542-46ae-a44e-9288982278e5\") " pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.113285 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-config\") pod \"dnsmasq-dns-848cf88cfc-sczsk\" (UID: \"025296af-e542-46ae-a44e-9288982278e5\") " pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.166781 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-67c6d485d8-ww4wp"] Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.169515 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67c6d485d8-ww4wp" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.174158 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.174367 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-dlsjp" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.174476 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.174607 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.182295 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67c6d485d8-ww4wp"] Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.214776 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbe731b8-1f1d-449c-accb-3cb97696d1ae-config-data\") pod \"glance-default-external-api-0\" (UID: \"dbe731b8-1f1d-449c-accb-3cb97696d1ae\") " pod="openstack/glance-default-external-api-0" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.214861 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dbe731b8-1f1d-449c-accb-3cb97696d1ae-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dbe731b8-1f1d-449c-accb-3cb97696d1ae\") " pod="openstack/glance-default-external-api-0" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.214906 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbe731b8-1f1d-449c-accb-3cb97696d1ae-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dbe731b8-1f1d-449c-accb-3cb97696d1ae\") " pod="openstack/glance-default-external-api-0" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.214937 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-sczsk\" (UID: \"025296af-e542-46ae-a44e-9288982278e5\") " pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.214958 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-httpd-config\") pod \"neutron-67c6d485d8-ww4wp\" (UID: \"f40de6a5-783a-4e65-8cfc-dd84f8652b6f\") " pod="openstack/neutron-67c6d485d8-ww4wp" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.214977 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbe731b8-1f1d-449c-accb-3cb97696d1ae-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dbe731b8-1f1d-449c-accb-3cb97696d1ae\") " pod="openstack/glance-default-external-api-0" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.215003 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-sczsk\" (UID: 
\"025296af-e542-46ae-a44e-9288982278e5\") " pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.215020 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p2h6\" (UniqueName: \"kubernetes.io/projected/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-kube-api-access-6p2h6\") pod \"neutron-67c6d485d8-ww4wp\" (UID: \"f40de6a5-783a-4e65-8cfc-dd84f8652b6f\") " pod="openstack/neutron-67c6d485d8-ww4wp" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.215044 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-sczsk\" (UID: \"025296af-e542-46ae-a44e-9288982278e5\") " pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.215091 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbe731b8-1f1d-449c-accb-3cb97696d1ae-scripts\") pod \"glance-default-external-api-0\" (UID: \"dbe731b8-1f1d-449c-accb-3cb97696d1ae\") " pod="openstack/glance-default-external-api-0" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.215116 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-config\") pod \"neutron-67c6d485d8-ww4wp\" (UID: \"f40de6a5-783a-4e65-8cfc-dd84f8652b6f\") " pod="openstack/neutron-67c6d485d8-ww4wp" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.215174 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-sczsk\" (UID: \"025296af-e542-46ae-a44e-9288982278e5\") " pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.215206 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-config\") pod \"dnsmasq-dns-848cf88cfc-sczsk\" (UID: \"025296af-e542-46ae-a44e-9288982278e5\") " pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.215235 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-combined-ca-bundle\") pod \"neutron-67c6d485d8-ww4wp\" (UID: \"f40de6a5-783a-4e65-8cfc-dd84f8652b6f\") " pod="openstack/neutron-67c6d485d8-ww4wp" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.215287 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"dbe731b8-1f1d-449c-accb-3cb97696d1ae\") " pod="openstack/glance-default-external-api-0" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.215324 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9jz9\" (UniqueName: \"kubernetes.io/projected/dbe731b8-1f1d-449c-accb-3cb97696d1ae-kube-api-access-l9jz9\") pod \"glance-default-external-api-0\" (UID: \"dbe731b8-1f1d-449c-accb-3cb97696d1ae\") " 
pod="openstack/glance-default-external-api-0" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.215355 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffn7x\" (UniqueName: \"kubernetes.io/projected/025296af-e542-46ae-a44e-9288982278e5-kube-api-access-ffn7x\") pod \"dnsmasq-dns-848cf88cfc-sczsk\" (UID: \"025296af-e542-46ae-a44e-9288982278e5\") " pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.215373 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbe731b8-1f1d-449c-accb-3cb97696d1ae-logs\") pod \"glance-default-external-api-0\" (UID: \"dbe731b8-1f1d-449c-accb-3cb97696d1ae\") " pod="openstack/glance-default-external-api-0" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.215399 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-ovndb-tls-certs\") pod \"neutron-67c6d485d8-ww4wp\" (UID: \"f40de6a5-783a-4e65-8cfc-dd84f8652b6f\") " pod="openstack/neutron-67c6d485d8-ww4wp" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.215818 4681 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"dbe731b8-1f1d-449c-accb-3cb97696d1ae\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.217403 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-sczsk\" (UID: \"025296af-e542-46ae-a44e-9288982278e5\") " pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.218249 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dbe731b8-1f1d-449c-accb-3cb97696d1ae-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dbe731b8-1f1d-449c-accb-3cb97696d1ae\") " pod="openstack/glance-default-external-api-0" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.219374 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbe731b8-1f1d-449c-accb-3cb97696d1ae-logs\") pod \"glance-default-external-api-0\" (UID: \"dbe731b8-1f1d-449c-accb-3cb97696d1ae\") " pod="openstack/glance-default-external-api-0" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.219770 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-sczsk\" (UID: \"025296af-e542-46ae-a44e-9288982278e5\") " pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.220229 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-config\") pod \"dnsmasq-dns-848cf88cfc-sczsk\" (UID: \"025296af-e542-46ae-a44e-9288982278e5\") " pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.220572 4681 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-sczsk\" (UID: \"025296af-e542-46ae-a44e-9288982278e5\") " pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.221640 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-sczsk\" (UID: \"025296af-e542-46ae-a44e-9288982278e5\") " pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" Oct 07 17:23:34 crc kubenswrapper[4681]: W1007 17:23:34.237151 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1469d2bd_93c0_414a_951e_175bc73f377e.slice/crio-e334bd83f73eaaed8ce05da090efdd01f8b52e09cc571e916e8e9bf8fb0f0354 WatchSource:0}: Error finding container e334bd83f73eaaed8ce05da090efdd01f8b52e09cc571e916e8e9bf8fb0f0354: Status 404 returned error can't find the container with id e334bd83f73eaaed8ce05da090efdd01f8b52e09cc571e916e8e9bf8fb0f0354 Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.240630 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbe731b8-1f1d-449c-accb-3cb97696d1ae-scripts\") pod \"glance-default-external-api-0\" (UID: \"dbe731b8-1f1d-449c-accb-3cb97696d1ae\") " pod="openstack/glance-default-external-api-0" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.244579 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbe731b8-1f1d-449c-accb-3cb97696d1ae-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dbe731b8-1f1d-449c-accb-3cb97696d1ae\") " pod="openstack/glance-default-external-api-0" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.248071 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbe731b8-1f1d-449c-accb-3cb97696d1ae-config-data\") pod \"glance-default-external-api-0\" (UID: \"dbe731b8-1f1d-449c-accb-3cb97696d1ae\") " pod="openstack/glance-default-external-api-0" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.249338 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbe731b8-1f1d-449c-accb-3cb97696d1ae-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dbe731b8-1f1d-449c-accb-3cb97696d1ae\") " pod="openstack/glance-default-external-api-0" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.260735 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9jz9\" (UniqueName: \"kubernetes.io/projected/dbe731b8-1f1d-449c-accb-3cb97696d1ae-kube-api-access-l9jz9\") pod \"glance-default-external-api-0\" (UID: \"dbe731b8-1f1d-449c-accb-3cb97696d1ae\") " pod="openstack/glance-default-external-api-0" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.285569 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffn7x\" (UniqueName: \"kubernetes.io/projected/025296af-e542-46ae-a44e-9288982278e5-kube-api-access-ffn7x\") pod \"dnsmasq-dns-848cf88cfc-sczsk\" (UID: \"025296af-e542-46ae-a44e-9288982278e5\") " pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" Oct 
07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.325972 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-combined-ca-bundle\") pod \"neutron-67c6d485d8-ww4wp\" (UID: \"f40de6a5-783a-4e65-8cfc-dd84f8652b6f\") " pod="openstack/neutron-67c6d485d8-ww4wp" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.326223 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-ovndb-tls-certs\") pod \"neutron-67c6d485d8-ww4wp\" (UID: \"f40de6a5-783a-4e65-8cfc-dd84f8652b6f\") " pod="openstack/neutron-67c6d485d8-ww4wp" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.326394 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-httpd-config\") pod \"neutron-67c6d485d8-ww4wp\" (UID: \"f40de6a5-783a-4e65-8cfc-dd84f8652b6f\") " pod="openstack/neutron-67c6d485d8-ww4wp" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.326495 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p2h6\" (UniqueName: \"kubernetes.io/projected/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-kube-api-access-6p2h6\") pod \"neutron-67c6d485d8-ww4wp\" (UID: \"f40de6a5-783a-4e65-8cfc-dd84f8652b6f\") " pod="openstack/neutron-67c6d485d8-ww4wp" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.326632 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-config\") pod \"neutron-67c6d485d8-ww4wp\" (UID: \"f40de6a5-783a-4e65-8cfc-dd84f8652b6f\") " pod="openstack/neutron-67c6d485d8-ww4wp" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.357774 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-httpd-config\") pod \"neutron-67c6d485d8-ww4wp\" (UID: \"f40de6a5-783a-4e65-8cfc-dd84f8652b6f\") " pod="openstack/neutron-67c6d485d8-ww4wp" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.358399 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-combined-ca-bundle\") pod \"neutron-67c6d485d8-ww4wp\" (UID: \"f40de6a5-783a-4e65-8cfc-dd84f8652b6f\") " pod="openstack/neutron-67c6d485d8-ww4wp" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.361251 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-config\") pod \"neutron-67c6d485d8-ww4wp\" (UID: \"f40de6a5-783a-4e65-8cfc-dd84f8652b6f\") " pod="openstack/neutron-67c6d485d8-ww4wp" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.361869 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-ovndb-tls-certs\") pod \"neutron-67c6d485d8-ww4wp\" (UID: \"f40de6a5-783a-4e65-8cfc-dd84f8652b6f\") " pod="openstack/neutron-67c6d485d8-ww4wp" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.369814 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p2h6\" (UniqueName: 
\"kubernetes.io/projected/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-kube-api-access-6p2h6\") pod \"neutron-67c6d485d8-ww4wp\" (UID: \"f40de6a5-783a-4e65-8cfc-dd84f8652b6f\") " pod="openstack/neutron-67c6d485d8-ww4wp" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.423199 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.540648 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67c6d485d8-ww4wp" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.593386 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"dbe731b8-1f1d-449c-accb-3cb97696d1ae\") " pod="openstack/glance-default-external-api-0" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.627328 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.646988 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1469d2bd-93c0-414a-951e-175bc73f377e","Type":"ContainerStarted","Data":"e334bd83f73eaaed8ce05da090efdd01f8b52e09cc571e916e8e9bf8fb0f0354"} Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.680632 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nt64g" event={"ID":"85befb0e-1557-44bb-b783-f0ea67d38de9","Type":"ContainerStarted","Data":"93ad7b183b558a32291106a0e4c44a05a04f97e8c0c7b38613fdc80e793cce88"} Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.711770 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-57b57fb795-6426k" event={"ID":"62d6d4e2-d1d4-4967-82e9-143266e1165b","Type":"ContainerStarted","Data":"c220cd8f834acbb764efa11586460f65516edc820c9b5be3a15a2ca52ba777fa"} Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.733563 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d8b9fbb46-6wjkq" event={"ID":"07f40489-1614-45c8-864b-2288473c7c1d","Type":"ContainerStarted","Data":"6b0485c8ee0966c1067b4551a5af37bf0759899efeb08f97da2dc487ea5fc722"} Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.766132 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d869d8764-5bjtz" event={"ID":"f35c1eb1-692d-4484-a686-5ad0ce63744b","Type":"ContainerStarted","Data":"3642efca461cb9f6e97dc993d7abfb49a6bf6e9925f784f65b6edeeb16270cc6"} Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.771577 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-nt64g" podStartSLOduration=14.77153168 podStartE2EDuration="14.77153168s" podCreationTimestamp="2025-10-07 17:23:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:23:34.734007658 +0000 UTC m=+1218.381419213" watchObservedRunningTime="2025-10-07 17:23:34.77153168 +0000 UTC m=+1218.418943245" Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.786117 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-d8nsb" 
event={"ID":"57a8142b-ea3d-4907-8331-885c973462eb","Type":"ContainerStarted","Data":"9e1453d294f2fac10db287df17c53e809357c08a15bb80efc20a4edc4c3d763e"} Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.831019 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-976bbb468-rxpr4" event={"ID":"c0e22a43-39e2-4154-b998-dcc84cadf262","Type":"ContainerStarted","Data":"553dc9dbf1649d562b756878a4c23cf75a55aab78eade6900e43db6f13dd0a21"} Oct 07 17:23:34 crc kubenswrapper[4681]: I1007 17:23:34.841513 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c67bffd47-g59pm" event={"ID":"4cfda27f-d02b-4885-b681-d84af6856bfe","Type":"ContainerStarted","Data":"ba9f031d993f12b819b913d5a1f5e578638b3e472f084f72fad8cdc0d05f4d70"} Oct 07 17:23:35 crc kubenswrapper[4681]: I1007 17:23:35.076025 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a8db576-98ff-44c4-9c62-89332a95ad61" path="/var/lib/kubelet/pods/8a8db576-98ff-44c4-9c62-89332a95ad61/volumes" Oct 07 17:23:35 crc kubenswrapper[4681]: I1007 17:23:35.076746 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-sczsk"] Oct 07 17:23:35 crc kubenswrapper[4681]: I1007 17:23:35.861939 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" event={"ID":"025296af-e542-46ae-a44e-9288982278e5","Type":"ContainerStarted","Data":"4b87779c8ebd9eb4548272913c1c728190b5d209f3b9762e47baf87f17227766"} Oct 07 17:23:35 crc kubenswrapper[4681]: I1007 17:23:35.874289 4681 generic.go:334] "Generic (PLEG): container finished" podID="85befb0e-1557-44bb-b783-f0ea67d38de9" containerID="93ad7b183b558a32291106a0e4c44a05a04f97e8c0c7b38613fdc80e793cce88" exitCode=0 Oct 07 17:23:35 crc kubenswrapper[4681]: I1007 17:23:35.874603 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nt64g" event={"ID":"85befb0e-1557-44bb-b783-f0ea67d38de9","Type":"ContainerDied","Data":"93ad7b183b558a32291106a0e4c44a05a04f97e8c0c7b38613fdc80e793cce88"} Oct 07 17:23:35 crc kubenswrapper[4681]: I1007 17:23:35.882159 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-976bbb468-rxpr4" event={"ID":"c0e22a43-39e2-4154-b998-dcc84cadf262","Type":"ContainerStarted","Data":"38420a1fbd331bf5c3a9fe6aa06da5ece1572a8b32a67d00aff20238dd72afd3"} Oct 07 17:23:35 crc kubenswrapper[4681]: I1007 17:23:35.882976 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-976bbb468-rxpr4" Oct 07 17:23:35 crc kubenswrapper[4681]: I1007 17:23:35.882997 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-976bbb468-rxpr4" Oct 07 17:23:35 crc kubenswrapper[4681]: I1007 17:23:35.895768 4681 generic.go:334] "Generic (PLEG): container finished" podID="4cfda27f-d02b-4885-b681-d84af6856bfe" containerID="8596a94e8a4861b6492f77967a8f51b077e1c009b4ec8aefbafbb5fb6395ad38" exitCode=0 Oct 07 17:23:35 crc kubenswrapper[4681]: I1007 17:23:35.895899 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c67bffd47-g59pm" event={"ID":"4cfda27f-d02b-4885-b681-d84af6856bfe","Type":"ContainerDied","Data":"8596a94e8a4861b6492f77967a8f51b077e1c009b4ec8aefbafbb5fb6395ad38"} Oct 07 17:23:35 crc kubenswrapper[4681]: I1007 17:23:35.913124 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d8b9fbb46-6wjkq" 
event={"ID":"07f40489-1614-45c8-864b-2288473c7c1d","Type":"ContainerStarted","Data":"df6ff9ed8a95cb69d8e5479cb98e338d7e3664e2fcf3ec7b9cd7ee26078e093a"} Oct 07 17:23:35 crc kubenswrapper[4681]: I1007 17:23:35.917610 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-976bbb468-rxpr4" podStartSLOduration=9.917594014 podStartE2EDuration="9.917594014s" podCreationTimestamp="2025-10-07 17:23:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:23:35.916216956 +0000 UTC m=+1219.563628511" watchObservedRunningTime="2025-10-07 17:23:35.917594014 +0000 UTC m=+1219.565005569" Oct 07 17:23:35 crc kubenswrapper[4681]: I1007 17:23:35.924090 4681 generic.go:334] "Generic (PLEG): container finished" podID="13c6b5bc-aeb1-47bb-995f-cf7d67007900" containerID="4b8c1401229f3aca69ec6a37493278b1b60a12bdf6125f4dfd244ed76351a276" exitCode=0 Oct 07 17:23:35 crc kubenswrapper[4681]: I1007 17:23:35.924170 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kvv88" event={"ID":"13c6b5bc-aeb1-47bb-995f-cf7d67007900","Type":"ContainerDied","Data":"4b8c1401229f3aca69ec6a37493278b1b60a12bdf6125f4dfd244ed76351a276"} Oct 07 17:23:35 crc kubenswrapper[4681]: I1007 17:23:35.930265 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-x9lv2" event={"ID":"a53e8384-cd97-4cec-ae70-918f86112a99","Type":"ContainerStarted","Data":"650f29af75c677191b6adcab2341ac25de978d7148c459c417359d2622dc6b92"} Oct 07 17:23:35 crc kubenswrapper[4681]: I1007 17:23:35.947675 4681 generic.go:334] "Generic (PLEG): container finished" podID="57a8142b-ea3d-4907-8331-885c973462eb" containerID="9fd06254547cd22f2f74f6da45b04403bc5efd70122ed1f1c78216e4562ec195" exitCode=0 Oct 07 17:23:35 crc kubenswrapper[4681]: I1007 17:23:35.947752 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-d8nsb" event={"ID":"57a8142b-ea3d-4907-8331-885c973462eb","Type":"ContainerDied","Data":"9fd06254547cd22f2f74f6da45b04403bc5efd70122ed1f1c78216e4562ec195"} Oct 07 17:23:35 crc kubenswrapper[4681]: I1007 17:23:35.963647 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67c6d485d8-ww4wp"] Oct 07 17:23:35 crc kubenswrapper[4681]: W1007 17:23:35.967725 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf40de6a5_783a_4e65_8cfc_dd84f8652b6f.slice/crio-94551cb12b5d6ba70361ea61af8f982558a814055961036dceeca55b353add80 WatchSource:0}: Error finding container 94551cb12b5d6ba70361ea61af8f982558a814055961036dceeca55b353add80: Status 404 returned error can't find the container with id 94551cb12b5d6ba70361ea61af8f982558a814055961036dceeca55b353add80 Oct 07 17:23:36 crc kubenswrapper[4681]: I1007 17:23:36.003305 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-x9lv2" podStartSLOduration=8.531795306 podStartE2EDuration="1m20.003286804s" podCreationTimestamp="2025-10-07 17:22:16 +0000 UTC" firstStartedPulling="2025-10-07 17:22:17.907099869 +0000 UTC m=+1141.554511424" lastFinishedPulling="2025-10-07 17:23:29.378591367 +0000 UTC m=+1213.026002922" observedRunningTime="2025-10-07 17:23:35.977895819 +0000 UTC m=+1219.625307384" watchObservedRunningTime="2025-10-07 17:23:36.003286804 +0000 UTC m=+1219.650698359" Oct 07 17:23:36 crc kubenswrapper[4681]: I1007 17:23:36.522591 4681 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c67bffd47-g59pm" Oct 07 17:23:36 crc kubenswrapper[4681]: I1007 17:23:36.655703 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-dns-swift-storage-0\") pod \"4cfda27f-d02b-4885-b681-d84af6856bfe\" (UID: \"4cfda27f-d02b-4885-b681-d84af6856bfe\") " Oct 07 17:23:36 crc kubenswrapper[4681]: I1007 17:23:36.655802 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59qk8\" (UniqueName: \"kubernetes.io/projected/4cfda27f-d02b-4885-b681-d84af6856bfe-kube-api-access-59qk8\") pod \"4cfda27f-d02b-4885-b681-d84af6856bfe\" (UID: \"4cfda27f-d02b-4885-b681-d84af6856bfe\") " Oct 07 17:23:36 crc kubenswrapper[4681]: I1007 17:23:36.655903 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-dns-svc\") pod \"4cfda27f-d02b-4885-b681-d84af6856bfe\" (UID: \"4cfda27f-d02b-4885-b681-d84af6856bfe\") " Oct 07 17:23:36 crc kubenswrapper[4681]: I1007 17:23:36.655933 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-config\") pod \"4cfda27f-d02b-4885-b681-d84af6856bfe\" (UID: \"4cfda27f-d02b-4885-b681-d84af6856bfe\") " Oct 07 17:23:36 crc kubenswrapper[4681]: I1007 17:23:36.656072 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-ovsdbserver-nb\") pod \"4cfda27f-d02b-4885-b681-d84af6856bfe\" (UID: \"4cfda27f-d02b-4885-b681-d84af6856bfe\") " Oct 07 17:23:36 crc kubenswrapper[4681]: I1007 17:23:36.656168 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-ovsdbserver-sb\") pod \"4cfda27f-d02b-4885-b681-d84af6856bfe\" (UID: \"4cfda27f-d02b-4885-b681-d84af6856bfe\") " Oct 07 17:23:36 crc kubenswrapper[4681]: I1007 17:23:36.696633 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cfda27f-d02b-4885-b681-d84af6856bfe-kube-api-access-59qk8" (OuterVolumeSpecName: "kube-api-access-59qk8") pod "4cfda27f-d02b-4885-b681-d84af6856bfe" (UID: "4cfda27f-d02b-4885-b681-d84af6856bfe"). InnerVolumeSpecName "kube-api-access-59qk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:23:36 crc kubenswrapper[4681]: I1007 17:23:36.717731 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4cfda27f-d02b-4885-b681-d84af6856bfe" (UID: "4cfda27f-d02b-4885-b681-d84af6856bfe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:23:36 crc kubenswrapper[4681]: I1007 17:23:36.724564 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4cfda27f-d02b-4885-b681-d84af6856bfe" (UID: "4cfda27f-d02b-4885-b681-d84af6856bfe"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:23:36 crc kubenswrapper[4681]: I1007 17:23:36.747048 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-config" (OuterVolumeSpecName: "config") pod "4cfda27f-d02b-4885-b681-d84af6856bfe" (UID: "4cfda27f-d02b-4885-b681-d84af6856bfe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:23:36 crc kubenswrapper[4681]: I1007 17:23:36.778088 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59qk8\" (UniqueName: \"kubernetes.io/projected/4cfda27f-d02b-4885-b681-d84af6856bfe-kube-api-access-59qk8\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:36 crc kubenswrapper[4681]: I1007 17:23:36.778124 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:36 crc kubenswrapper[4681]: I1007 17:23:36.778146 4681 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:36 crc kubenswrapper[4681]: I1007 17:23:36.778155 4681 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:36 crc kubenswrapper[4681]: I1007 17:23:36.792536 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4cfda27f-d02b-4885-b681-d84af6856bfe" (UID: "4cfda27f-d02b-4885-b681-d84af6856bfe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:23:36 crc kubenswrapper[4681]: I1007 17:23:36.801433 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4cfda27f-d02b-4885-b681-d84af6856bfe" (UID: "4cfda27f-d02b-4885-b681-d84af6856bfe"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:23:36 crc kubenswrapper[4681]: I1007 17:23:36.805751 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 17:23:36 crc kubenswrapper[4681]: I1007 17:23:36.880128 4681 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:36 crc kubenswrapper[4681]: I1007 17:23:36.880163 4681 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cfda27f-d02b-4885-b681-d84af6856bfe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:36 crc kubenswrapper[4681]: I1007 17:23:36.988437 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67c6d485d8-ww4wp" event={"ID":"f40de6a5-783a-4e65-8cfc-dd84f8652b6f","Type":"ContainerStarted","Data":"dbbb9faedc762111f44aab2a08b7341d199126351c97bd8b447a1397e913d93a"} Oct 07 17:23:36 crc kubenswrapper[4681]: I1007 17:23:36.988761 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67c6d485d8-ww4wp" event={"ID":"f40de6a5-783a-4e65-8cfc-dd84f8652b6f","Type":"ContainerStarted","Data":"94551cb12b5d6ba70361ea61af8f982558a814055961036dceeca55b353add80"} Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.008512 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1469d2bd-93c0-414a-951e-175bc73f377e","Type":"ContainerStarted","Data":"af44408665a3cba578e1ec45b262eed68b0d608219f24c469e1f7a9834dbde07"} Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.010928 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dbe731b8-1f1d-449c-accb-3cb97696d1ae","Type":"ContainerStarted","Data":"64447de4fb244a692ec6f2bf1863bf48b364c236aa82de9716976a20f68d0e61"} Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.012665 4681 generic.go:334] "Generic (PLEG): container finished" podID="025296af-e542-46ae-a44e-9288982278e5" containerID="0883020ee2f473224f1f044a9492c3cb85984bd67c55050f33c7dd3292a5d23b" exitCode=0 Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.012711 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" event={"ID":"025296af-e542-46ae-a44e-9288982278e5","Type":"ContainerDied","Data":"0883020ee2f473224f1f044a9492c3cb85984bd67c55050f33c7dd3292a5d23b"} Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.079958 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c67bffd47-g59pm" event={"ID":"4cfda27f-d02b-4885-b681-d84af6856bfe","Type":"ContainerDied","Data":"ba9f031d993f12b819b913d5a1f5e578638b3e472f084f72fad8cdc0d05f4d70"} Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.080006 4681 scope.go:117] "RemoveContainer" containerID="8596a94e8a4861b6492f77967a8f51b077e1c009b4ec8aefbafbb5fb6395ad38" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.080054 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c67bffd47-g59pm" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.183797 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d8b9fbb46-6wjkq" event={"ID":"07f40489-1614-45c8-864b-2288473c7c1d","Type":"ContainerStarted","Data":"8c75a05ea950a56fb60ad1596ac223509631a6e41fbe0f3611e44ac961b0cc66"} Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.183836 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.183859 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.367698 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7d8b9fbb46-6wjkq" podStartSLOduration=9.367677121 podStartE2EDuration="9.367677121s" podCreationTimestamp="2025-10-07 17:23:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:23:37.347779289 +0000 UTC m=+1220.995190844" watchObservedRunningTime="2025-10-07 17:23:37.367677121 +0000 UTC m=+1221.015088676" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.438935 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c67bffd47-g59pm"] Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.440474 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.442239 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.461597 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c67bffd47-g59pm"] Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.628219 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.628583 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.782342 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5b94d78545-dfdgb"] Oct 07 17:23:37 crc kubenswrapper[4681]: E1007 17:23:37.782731 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cfda27f-d02b-4885-b681-d84af6856bfe" containerName="init" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.782742 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cfda27f-d02b-4885-b681-d84af6856bfe" containerName="init" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.782950 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cfda27f-d02b-4885-b681-d84af6856bfe" containerName="init" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.783855 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5b94d78545-dfdgb" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.796967 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.797140 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.807118 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c77522e8-d403-4227-9740-21dca2843c58-public-tls-certs\") pod \"neutron-5b94d78545-dfdgb\" (UID: \"c77522e8-d403-4227-9740-21dca2843c58\") " pod="openstack/neutron-5b94d78545-dfdgb" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.807183 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69g2k\" (UniqueName: \"kubernetes.io/projected/c77522e8-d403-4227-9740-21dca2843c58-kube-api-access-69g2k\") pod \"neutron-5b94d78545-dfdgb\" (UID: \"c77522e8-d403-4227-9740-21dca2843c58\") " pod="openstack/neutron-5b94d78545-dfdgb" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.807262 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c77522e8-d403-4227-9740-21dca2843c58-config\") pod \"neutron-5b94d78545-dfdgb\" (UID: \"c77522e8-d403-4227-9740-21dca2843c58\") " pod="openstack/neutron-5b94d78545-dfdgb" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.807282 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c77522e8-d403-4227-9740-21dca2843c58-httpd-config\") pod \"neutron-5b94d78545-dfdgb\" (UID: \"c77522e8-d403-4227-9740-21dca2843c58\") " pod="openstack/neutron-5b94d78545-dfdgb" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.807297 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c77522e8-d403-4227-9740-21dca2843c58-combined-ca-bundle\") pod \"neutron-5b94d78545-dfdgb\" (UID: \"c77522e8-d403-4227-9740-21dca2843c58\") " pod="openstack/neutron-5b94d78545-dfdgb" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.807316 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c77522e8-d403-4227-9740-21dca2843c58-ovndb-tls-certs\") pod \"neutron-5b94d78545-dfdgb\" (UID: \"c77522e8-d403-4227-9740-21dca2843c58\") " pod="openstack/neutron-5b94d78545-dfdgb" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.807359 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c77522e8-d403-4227-9740-21dca2843c58-internal-tls-certs\") pod \"neutron-5b94d78545-dfdgb\" (UID: \"c77522e8-d403-4227-9740-21dca2843c58\") " pod="openstack/neutron-5b94d78545-dfdgb" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.810684 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b94d78545-dfdgb"] Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.913918 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c77522e8-d403-4227-9740-21dca2843c58-public-tls-certs\") pod \"neutron-5b94d78545-dfdgb\" (UID: \"c77522e8-d403-4227-9740-21dca2843c58\") " pod="openstack/neutron-5b94d78545-dfdgb" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.914224 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69g2k\" (UniqueName: \"kubernetes.io/projected/c77522e8-d403-4227-9740-21dca2843c58-kube-api-access-69g2k\") pod \"neutron-5b94d78545-dfdgb\" (UID: \"c77522e8-d403-4227-9740-21dca2843c58\") " pod="openstack/neutron-5b94d78545-dfdgb" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.914321 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c77522e8-d403-4227-9740-21dca2843c58-config\") pod \"neutron-5b94d78545-dfdgb\" (UID: \"c77522e8-d403-4227-9740-21dca2843c58\") " pod="openstack/neutron-5b94d78545-dfdgb" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.914338 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c77522e8-d403-4227-9740-21dca2843c58-httpd-config\") pod \"neutron-5b94d78545-dfdgb\" (UID: \"c77522e8-d403-4227-9740-21dca2843c58\") " pod="openstack/neutron-5b94d78545-dfdgb" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.914357 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c77522e8-d403-4227-9740-21dca2843c58-combined-ca-bundle\") pod \"neutron-5b94d78545-dfdgb\" (UID: \"c77522e8-d403-4227-9740-21dca2843c58\") " pod="openstack/neutron-5b94d78545-dfdgb" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.914376 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c77522e8-d403-4227-9740-21dca2843c58-ovndb-tls-certs\") pod \"neutron-5b94d78545-dfdgb\" (UID: \"c77522e8-d403-4227-9740-21dca2843c58\") " pod="openstack/neutron-5b94d78545-dfdgb" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.914424 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c77522e8-d403-4227-9740-21dca2843c58-internal-tls-certs\") pod \"neutron-5b94d78545-dfdgb\" (UID: \"c77522e8-d403-4227-9740-21dca2843c58\") " pod="openstack/neutron-5b94d78545-dfdgb" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.963238 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c77522e8-d403-4227-9740-21dca2843c58-internal-tls-certs\") pod \"neutron-5b94d78545-dfdgb\" (UID: \"c77522e8-d403-4227-9740-21dca2843c58\") " pod="openstack/neutron-5b94d78545-dfdgb" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.964769 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c77522e8-d403-4227-9740-21dca2843c58-httpd-config\") pod \"neutron-5b94d78545-dfdgb\" (UID: \"c77522e8-d403-4227-9740-21dca2843c58\") " pod="openstack/neutron-5b94d78545-dfdgb" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.972695 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c77522e8-d403-4227-9740-21dca2843c58-config\") pod \"neutron-5b94d78545-dfdgb\" (UID: \"c77522e8-d403-4227-9740-21dca2843c58\") " 
pod="openstack/neutron-5b94d78545-dfdgb" Oct 07 17:23:37 crc kubenswrapper[4681]: I1007 17:23:37.988431 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c77522e8-d403-4227-9740-21dca2843c58-public-tls-certs\") pod \"neutron-5b94d78545-dfdgb\" (UID: \"c77522e8-d403-4227-9740-21dca2843c58\") " pod="openstack/neutron-5b94d78545-dfdgb" Oct 07 17:23:38 crc kubenswrapper[4681]: I1007 17:23:38.001770 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c77522e8-d403-4227-9740-21dca2843c58-ovndb-tls-certs\") pod \"neutron-5b94d78545-dfdgb\" (UID: \"c77522e8-d403-4227-9740-21dca2843c58\") " pod="openstack/neutron-5b94d78545-dfdgb" Oct 07 17:23:38 crc kubenswrapper[4681]: I1007 17:23:38.004316 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c77522e8-d403-4227-9740-21dca2843c58-combined-ca-bundle\") pod \"neutron-5b94d78545-dfdgb\" (UID: \"c77522e8-d403-4227-9740-21dca2843c58\") " pod="openstack/neutron-5b94d78545-dfdgb" Oct 07 17:23:38 crc kubenswrapper[4681]: I1007 17:23:38.010446 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69g2k\" (UniqueName: \"kubernetes.io/projected/c77522e8-d403-4227-9740-21dca2843c58-kube-api-access-69g2k\") pod \"neutron-5b94d78545-dfdgb\" (UID: \"c77522e8-d403-4227-9740-21dca2843c58\") " pod="openstack/neutron-5b94d78545-dfdgb" Oct 07 17:23:38 crc kubenswrapper[4681]: I1007 17:23:38.128095 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b94d78545-dfdgb" Oct 07 17:23:38 crc kubenswrapper[4681]: I1007 17:23:38.208525 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67c6d485d8-ww4wp" event={"ID":"f40de6a5-783a-4e65-8cfc-dd84f8652b6f","Type":"ContainerStarted","Data":"dd8f65c730c61845fe20219f2d0e5fe26b5bbf646d4b0ae5bc18d711ed0d103c"} Oct 07 17:23:38 crc kubenswrapper[4681]: I1007 17:23:38.209593 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-67c6d485d8-ww4wp" Oct 07 17:23:38 crc kubenswrapper[4681]: I1007 17:23:38.239154 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-67c6d485d8-ww4wp" podStartSLOduration=4.23913872 podStartE2EDuration="4.23913872s" podCreationTimestamp="2025-10-07 17:23:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:23:38.234309776 +0000 UTC m=+1221.881721331" watchObservedRunningTime="2025-10-07 17:23:38.23913872 +0000 UTC m=+1221.886550275" Oct 07 17:23:38 crc kubenswrapper[4681]: I1007 17:23:38.772758 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-kvv88" Oct 07 17:23:38 crc kubenswrapper[4681]: I1007 17:23:38.785824 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-d8nsb" Oct 07 17:23:38 crc kubenswrapper[4681]: I1007 17:23:38.798487 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-nt64g" Oct 07 17:23:38 crc kubenswrapper[4681]: I1007 17:23:38.859427 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gb56k\" (UniqueName: \"kubernetes.io/projected/85befb0e-1557-44bb-b783-f0ea67d38de9-kube-api-access-gb56k\") pod \"85befb0e-1557-44bb-b783-f0ea67d38de9\" (UID: \"85befb0e-1557-44bb-b783-f0ea67d38de9\") " Oct 07 17:23:38 crc kubenswrapper[4681]: I1007 17:23:38.859511 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqkk2\" (UniqueName: \"kubernetes.io/projected/57a8142b-ea3d-4907-8331-885c973462eb-kube-api-access-dqkk2\") pod \"57a8142b-ea3d-4907-8331-885c973462eb\" (UID: \"57a8142b-ea3d-4907-8331-885c973462eb\") " Oct 07 17:23:38 crc kubenswrapper[4681]: I1007 17:23:38.859666 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tmcp\" (UniqueName: \"kubernetes.io/projected/13c6b5bc-aeb1-47bb-995f-cf7d67007900-kube-api-access-8tmcp\") pod \"13c6b5bc-aeb1-47bb-995f-cf7d67007900\" (UID: \"13c6b5bc-aeb1-47bb-995f-cf7d67007900\") " Oct 07 17:23:38 crc kubenswrapper[4681]: I1007 17:23:38.867092 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13c6b5bc-aeb1-47bb-995f-cf7d67007900-kube-api-access-8tmcp" (OuterVolumeSpecName: "kube-api-access-8tmcp") pod "13c6b5bc-aeb1-47bb-995f-cf7d67007900" (UID: "13c6b5bc-aeb1-47bb-995f-cf7d67007900"). InnerVolumeSpecName "kube-api-access-8tmcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:23:38 crc kubenswrapper[4681]: I1007 17:23:38.871710 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85befb0e-1557-44bb-b783-f0ea67d38de9-kube-api-access-gb56k" (OuterVolumeSpecName: "kube-api-access-gb56k") pod "85befb0e-1557-44bb-b783-f0ea67d38de9" (UID: "85befb0e-1557-44bb-b783-f0ea67d38de9"). InnerVolumeSpecName "kube-api-access-gb56k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:23:38 crc kubenswrapper[4681]: I1007 17:23:38.886847 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a8142b-ea3d-4907-8331-885c973462eb-kube-api-access-dqkk2" (OuterVolumeSpecName: "kube-api-access-dqkk2") pod "57a8142b-ea3d-4907-8331-885c973462eb" (UID: "57a8142b-ea3d-4907-8331-885c973462eb"). InnerVolumeSpecName "kube-api-access-dqkk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:23:38 crc kubenswrapper[4681]: I1007 17:23:38.964157 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tmcp\" (UniqueName: \"kubernetes.io/projected/13c6b5bc-aeb1-47bb-995f-cf7d67007900-kube-api-access-8tmcp\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:38 crc kubenswrapper[4681]: I1007 17:23:38.964192 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gb56k\" (UniqueName: \"kubernetes.io/projected/85befb0e-1557-44bb-b783-f0ea67d38de9-kube-api-access-gb56k\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:38 crc kubenswrapper[4681]: I1007 17:23:38.964204 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqkk2\" (UniqueName: \"kubernetes.io/projected/57a8142b-ea3d-4907-8331-885c973462eb-kube-api-access-dqkk2\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:39 crc kubenswrapper[4681]: I1007 17:23:39.039924 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cfda27f-d02b-4885-b681-d84af6856bfe" path="/var/lib/kubelet/pods/4cfda27f-d02b-4885-b681-d84af6856bfe/volumes" Oct 07 17:23:39 crc kubenswrapper[4681]: I1007 17:23:39.231919 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-kvv88" event={"ID":"13c6b5bc-aeb1-47bb-995f-cf7d67007900","Type":"ContainerDied","Data":"aae5c9b522a3af327d3a6f004afcaa76f686cac56c3eae1ba95df121c5f6fbbd"} Oct 07 17:23:39 crc kubenswrapper[4681]: I1007 17:23:39.231946 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-kvv88" Oct 07 17:23:39 crc kubenswrapper[4681]: I1007 17:23:39.231956 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aae5c9b522a3af327d3a6f004afcaa76f686cac56c3eae1ba95df121c5f6fbbd" Oct 07 17:23:39 crc kubenswrapper[4681]: I1007 17:23:39.234454 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1469d2bd-93c0-414a-951e-175bc73f377e","Type":"ContainerStarted","Data":"53a75dfa39b03730aff54d2c1f4794f76b0674671b357474a0c580ab9e417c7b"} Oct 07 17:23:39 crc kubenswrapper[4681]: I1007 17:23:39.239044 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dbe731b8-1f1d-449c-accb-3cb97696d1ae","Type":"ContainerStarted","Data":"a039d4d041d8ece14e748db0b6c445ceb1899a4ecfb32c31bc1be901c77ed955"} Oct 07 17:23:39 crc kubenswrapper[4681]: I1007 17:23:39.242101 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-d8nsb" Oct 07 17:23:39 crc kubenswrapper[4681]: I1007 17:23:39.242102 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-d8nsb" event={"ID":"57a8142b-ea3d-4907-8331-885c973462eb","Type":"ContainerDied","Data":"9e1453d294f2fac10db287df17c53e809357c08a15bb80efc20a4edc4c3d763e"} Oct 07 17:23:39 crc kubenswrapper[4681]: I1007 17:23:39.242413 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e1453d294f2fac10db287df17c53e809357c08a15bb80efc20a4edc4c3d763e" Oct 07 17:23:39 crc kubenswrapper[4681]: I1007 17:23:39.244703 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" event={"ID":"025296af-e542-46ae-a44e-9288982278e5","Type":"ContainerStarted","Data":"b27cfe1a4a0bb7f071077c53c8144bc8a034567d2bdf6b1bf2ac69acd1a9b777"} Oct 07 17:23:39 crc kubenswrapper[4681]: I1007 17:23:39.247719 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nt64g" Oct 07 17:23:39 crc kubenswrapper[4681]: I1007 17:23:39.248136 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nt64g" event={"ID":"85befb0e-1557-44bb-b783-f0ea67d38de9","Type":"ContainerDied","Data":"ac7a7963f39c5b49f9030081e35e59ff5b1025617fcb0bd9781436c929a3ef1f"} Oct 07 17:23:39 crc kubenswrapper[4681]: I1007 17:23:39.248158 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac7a7963f39c5b49f9030081e35e59ff5b1025617fcb0bd9781436c929a3ef1f" Oct 07 17:23:39 crc kubenswrapper[4681]: I1007 17:23:39.259332 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.259318869 podStartE2EDuration="8.259318869s" podCreationTimestamp="2025-10-07 17:23:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:23:39.255002269 +0000 UTC m=+1222.902413824" watchObservedRunningTime="2025-10-07 17:23:39.259318869 +0000 UTC m=+1222.906730424" Oct 07 17:23:39 crc kubenswrapper[4681]: I1007 17:23:39.289154 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" podStartSLOduration=6.289139197 podStartE2EDuration="6.289139197s" podCreationTimestamp="2025-10-07 17:23:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:23:39.286043151 +0000 UTC m=+1222.933454706" watchObservedRunningTime="2025-10-07 17:23:39.289139197 +0000 UTC m=+1222.936550752" Oct 07 17:23:39 crc kubenswrapper[4681]: I1007 17:23:39.429112 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" Oct 07 17:23:40 crc kubenswrapper[4681]: I1007 17:23:40.115705 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b94d78545-dfdgb"] Oct 07 17:23:40 crc kubenswrapper[4681]: W1007 17:23:40.146643 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc77522e8_d403_4227_9740_21dca2843c58.slice/crio-8cc75a996d895e101018ea31b62fd4f60c8a529e08953845bf687d93fbba26d5 WatchSource:0}: Error finding container 8cc75a996d895e101018ea31b62fd4f60c8a529e08953845bf687d93fbba26d5: Status 404 returned error can't 
Oct 07 17:23:40 crc kubenswrapper[4681]: I1007 17:23:40.266753 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-57b57fb795-6426k" event={"ID":"62d6d4e2-d1d4-4967-82e9-143266e1165b","Type":"ContainerStarted","Data":"946b2c49d4b3d3aab137c7b56a8732e12e2ed7acaa45f05b4acb58219af0026a"}
Oct 07 17:23:40 crc kubenswrapper[4681]: I1007 17:23:40.268189 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d869d8764-5bjtz" event={"ID":"f35c1eb1-692d-4484-a686-5ad0ce63744b","Type":"ContainerStarted","Data":"dde15c91963dcb29a1d9e810ba25d7a48163f9c89ec46331f4506bc71ea9f01e"}
Oct 07 17:23:40 crc kubenswrapper[4681]: I1007 17:23:40.270004 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b94d78545-dfdgb" event={"ID":"c77522e8-d403-4227-9740-21dca2843c58","Type":"ContainerStarted","Data":"8cc75a996d895e101018ea31b62fd4f60c8a529e08953845bf687d93fbba26d5"}
Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.052527 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.149704 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4673f09e-2140-4dc5-ac9d-af616ddba08d-sg-core-conf-yaml\") pod \"4673f09e-2140-4dc5-ac9d-af616ddba08d\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") "
Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.149791 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcr7v\" (UniqueName: \"kubernetes.io/projected/4673f09e-2140-4dc5-ac9d-af616ddba08d-kube-api-access-hcr7v\") pod \"4673f09e-2140-4dc5-ac9d-af616ddba08d\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") "
Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.149829 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4673f09e-2140-4dc5-ac9d-af616ddba08d-config-data\") pod \"4673f09e-2140-4dc5-ac9d-af616ddba08d\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") "
Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.149935 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4673f09e-2140-4dc5-ac9d-af616ddba08d-combined-ca-bundle\") pod \"4673f09e-2140-4dc5-ac9d-af616ddba08d\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") "
Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.150025 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4673f09e-2140-4dc5-ac9d-af616ddba08d-scripts\") pod \"4673f09e-2140-4dc5-ac9d-af616ddba08d\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") "
Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.150059 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4673f09e-2140-4dc5-ac9d-af616ddba08d-run-httpd\") pod \"4673f09e-2140-4dc5-ac9d-af616ddba08d\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") "
Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.150078 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4673f09e-2140-4dc5-ac9d-af616ddba08d-log-httpd\") pod \"4673f09e-2140-4dc5-ac9d-af616ddba08d\" (UID: \"4673f09e-2140-4dc5-ac9d-af616ddba08d\") "
Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.150494 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4673f09e-2140-4dc5-ac9d-af616ddba08d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4673f09e-2140-4dc5-ac9d-af616ddba08d" (UID: "4673f09e-2140-4dc5-ac9d-af616ddba08d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.150607 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4673f09e-2140-4dc5-ac9d-af616ddba08d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4673f09e-2140-4dc5-ac9d-af616ddba08d" (UID: "4673f09e-2140-4dc5-ac9d-af616ddba08d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.151501 4681 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4673f09e-2140-4dc5-ac9d-af616ddba08d-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.151522 4681 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4673f09e-2140-4dc5-ac9d-af616ddba08d-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.172058 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4673f09e-2140-4dc5-ac9d-af616ddba08d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4673f09e-2140-4dc5-ac9d-af616ddba08d" (UID: "4673f09e-2140-4dc5-ac9d-af616ddba08d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.175157 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4673f09e-2140-4dc5-ac9d-af616ddba08d-scripts" (OuterVolumeSpecName: "scripts") pod "4673f09e-2140-4dc5-ac9d-af616ddba08d" (UID: "4673f09e-2140-4dc5-ac9d-af616ddba08d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.177693 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4673f09e-2140-4dc5-ac9d-af616ddba08d-kube-api-access-hcr7v" (OuterVolumeSpecName: "kube-api-access-hcr7v") pod "4673f09e-2140-4dc5-ac9d-af616ddba08d" (UID: "4673f09e-2140-4dc5-ac9d-af616ddba08d"). InnerVolumeSpecName "kube-api-access-hcr7v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.229193 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4673f09e-2140-4dc5-ac9d-af616ddba08d-config-data" (OuterVolumeSpecName: "config-data") pod "4673f09e-2140-4dc5-ac9d-af616ddba08d" (UID: "4673f09e-2140-4dc5-ac9d-af616ddba08d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.240609 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4673f09e-2140-4dc5-ac9d-af616ddba08d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4673f09e-2140-4dc5-ac9d-af616ddba08d" (UID: "4673f09e-2140-4dc5-ac9d-af616ddba08d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.253035 4681 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4673f09e-2140-4dc5-ac9d-af616ddba08d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.253073 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcr7v\" (UniqueName: \"kubernetes.io/projected/4673f09e-2140-4dc5-ac9d-af616ddba08d-kube-api-access-hcr7v\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.253085 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4673f09e-2140-4dc5-ac9d-af616ddba08d-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.253093 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4673f09e-2140-4dc5-ac9d-af616ddba08d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.253101 4681 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4673f09e-2140-4dc5-ac9d-af616ddba08d-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.280301 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b94d78545-dfdgb" event={"ID":"c77522e8-d403-4227-9740-21dca2843c58","Type":"ContainerStarted","Data":"43c1b1b1e0ad979d4009d7a49cf8ea291726a81d365239a9bc5077f196fafa8b"} Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.280919 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5b94d78545-dfdgb" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.280939 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b94d78545-dfdgb" event={"ID":"c77522e8-d403-4227-9740-21dca2843c58","Type":"ContainerStarted","Data":"42dc068649e2605ac81b93b27edf98846342bdf4c9d5288df2db66999427e4a1"} Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.281936 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dbe731b8-1f1d-449c-accb-3cb97696d1ae","Type":"ContainerStarted","Data":"0067ed2d1def0dc793db180840e468d911bb6d5740f6a96c9c443d1b7a97fd8a"} Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.283753 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-57b57fb795-6426k" event={"ID":"62d6d4e2-d1d4-4967-82e9-143266e1165b","Type":"ContainerStarted","Data":"5450e66bf2aa148a8fdd4a78deac2e6ba7040b042b63d2e0237366b73a43ee53"} Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.285141 4681 generic.go:334] "Generic (PLEG): container finished" podID="4673f09e-2140-4dc5-ac9d-af616ddba08d" containerID="b2b40807d3dc738f5256b38322f67bc4941ad7e844bfc7a037b6434a95bfd39c" exitCode=0 Oct 07 17:23:41 crc 
kubenswrapper[4681]: I1007 17:23:41.285203 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4673f09e-2140-4dc5-ac9d-af616ddba08d","Type":"ContainerDied","Data":"b2b40807d3dc738f5256b38322f67bc4941ad7e844bfc7a037b6434a95bfd39c"} Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.285220 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4673f09e-2140-4dc5-ac9d-af616ddba08d","Type":"ContainerDied","Data":"63fbeffa619773135c369dc51c075283bb13b6d0d75273069f17c11e7701f65e"} Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.285236 4681 scope.go:117] "RemoveContainer" containerID="b2b40807d3dc738f5256b38322f67bc4941ad7e844bfc7a037b6434a95bfd39c" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.285332 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.289155 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d869d8764-5bjtz" event={"ID":"f35c1eb1-692d-4484-a686-5ad0ce63744b","Type":"ContainerStarted","Data":"6e916f37024ab8d2db3c1288f50dc76c7971975077824ab04f6bb1699f1e02d0"} Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.300086 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5b94d78545-dfdgb" podStartSLOduration=4.300068628 podStartE2EDuration="4.300068628s" podCreationTimestamp="2025-10-07 17:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:23:41.297019323 +0000 UTC m=+1224.944430878" watchObservedRunningTime="2025-10-07 17:23:41.300068628 +0000 UTC m=+1224.947480183" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.313130 4681 scope.go:117] "RemoveContainer" containerID="47933adc22c72093df0bb815109463ebebb9ddee791709ddfd8670e815a64f85" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.337131 4681 scope.go:117] "RemoveContainer" containerID="b2b40807d3dc738f5256b38322f67bc4941ad7e844bfc7a037b6434a95bfd39c" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.339232 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-57b57fb795-6426k" podStartSLOduration=10.624943138 podStartE2EDuration="16.339213245s" podCreationTimestamp="2025-10-07 17:23:25 +0000 UTC" firstStartedPulling="2025-10-07 17:23:33.743101172 +0000 UTC m=+1217.390512727" lastFinishedPulling="2025-10-07 17:23:39.457371279 +0000 UTC m=+1223.104782834" observedRunningTime="2025-10-07 17:23:41.332929869 +0000 UTC m=+1224.980341414" watchObservedRunningTime="2025-10-07 17:23:41.339213245 +0000 UTC m=+1224.986624800" Oct 07 17:23:41 crc kubenswrapper[4681]: E1007 17:23:41.340216 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2b40807d3dc738f5256b38322f67bc4941ad7e844bfc7a037b6434a95bfd39c\": container with ID starting with b2b40807d3dc738f5256b38322f67bc4941ad7e844bfc7a037b6434a95bfd39c not found: ID does not exist" containerID="b2b40807d3dc738f5256b38322f67bc4941ad7e844bfc7a037b6434a95bfd39c" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.340270 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2b40807d3dc738f5256b38322f67bc4941ad7e844bfc7a037b6434a95bfd39c"} err="failed to get container status 
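The NotFound errors above are the usual idempotent-cleanup pattern: the container is already gone, so the follow-up status lookup fails and the kubelet just logs it and moves on. A sketch of the same pattern for external tooling (string matching for illustration only; a real CRI client would inspect the gRPC status code):

```go
package main

import (
	"fmt"
	"strings"
)

// removeContainer stands in for a CRI RemoveContainer call; here it always
// reports the container as already gone, like the log lines above.
func removeContainer(id string) error {
	return fmt.Errorf("rpc error: code = NotFound desc = could not find container %q", id)
}

// removeIfPresent treats NotFound as success: the goal state (container
// removed) is already reached, so there is nothing left to do.
func removeIfPresent(id string) error {
	if err := removeContainer(id); err != nil &&
		!strings.Contains(err.Error(), "code = NotFound") {
		return err
	}
	return nil
}

func main() {
	if err := removeIfPresent("b2b40807d3dc"); err != nil {
		fmt.Println("cleanup failed:", err)
		return
	}
	fmt.Println("container gone")
}
```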
\"b2b40807d3dc738f5256b38322f67bc4941ad7e844bfc7a037b6434a95bfd39c\": rpc error: code = NotFound desc = could not find container \"b2b40807d3dc738f5256b38322f67bc4941ad7e844bfc7a037b6434a95bfd39c\": container with ID starting with b2b40807d3dc738f5256b38322f67bc4941ad7e844bfc7a037b6434a95bfd39c not found: ID does not exist" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.340299 4681 scope.go:117] "RemoveContainer" containerID="47933adc22c72093df0bb815109463ebebb9ddee791709ddfd8670e815a64f85" Oct 07 17:23:41 crc kubenswrapper[4681]: E1007 17:23:41.340780 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47933adc22c72093df0bb815109463ebebb9ddee791709ddfd8670e815a64f85\": container with ID starting with 47933adc22c72093df0bb815109463ebebb9ddee791709ddfd8670e815a64f85 not found: ID does not exist" containerID="47933adc22c72093df0bb815109463ebebb9ddee791709ddfd8670e815a64f85" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.340817 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47933adc22c72093df0bb815109463ebebb9ddee791709ddfd8670e815a64f85"} err="failed to get container status \"47933adc22c72093df0bb815109463ebebb9ddee791709ddfd8670e815a64f85\": rpc error: code = NotFound desc = could not find container \"47933adc22c72093df0bb815109463ebebb9ddee791709ddfd8670e815a64f85\": container with ID starting with 47933adc22c72093df0bb815109463ebebb9ddee791709ddfd8670e815a64f85 not found: ID does not exist" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.364039 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.364021362999999 podStartE2EDuration="8.364021363s" podCreationTimestamp="2025-10-07 17:23:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:23:41.360205327 +0000 UTC m=+1225.007616882" watchObservedRunningTime="2025-10-07 17:23:41.364021363 +0000 UTC m=+1225.011432918" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.391957 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7d869d8764-5bjtz" podStartSLOduration=10.896521839 podStartE2EDuration="16.391937069s" podCreationTimestamp="2025-10-07 17:23:25 +0000 UTC" firstStartedPulling="2025-10-07 17:23:33.961919378 +0000 UTC m=+1217.609330933" lastFinishedPulling="2025-10-07 17:23:39.457334608 +0000 UTC m=+1223.104746163" observedRunningTime="2025-10-07 17:23:41.386358104 +0000 UTC m=+1225.033769659" watchObservedRunningTime="2025-10-07 17:23:41.391937069 +0000 UTC m=+1225.039348624" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.445807 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.448439 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.474847 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 17:23:41 crc kubenswrapper[4681]: E1007 17:23:41.475298 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4673f09e-2140-4dc5-ac9d-af616ddba08d" containerName="ceilometer-notification-agent" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.475321 4681 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="4673f09e-2140-4dc5-ac9d-af616ddba08d" containerName="ceilometer-notification-agent" Oct 07 17:23:41 crc kubenswrapper[4681]: E1007 17:23:41.475359 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85befb0e-1557-44bb-b783-f0ea67d38de9" containerName="mariadb-database-create" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.475366 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="85befb0e-1557-44bb-b783-f0ea67d38de9" containerName="mariadb-database-create" Oct 07 17:23:41 crc kubenswrapper[4681]: E1007 17:23:41.475375 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a8142b-ea3d-4907-8331-885c973462eb" containerName="mariadb-database-create" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.475381 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a8142b-ea3d-4907-8331-885c973462eb" containerName="mariadb-database-create" Oct 07 17:23:41 crc kubenswrapper[4681]: E1007 17:23:41.475390 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c6b5bc-aeb1-47bb-995f-cf7d67007900" containerName="mariadb-database-create" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.475396 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c6b5bc-aeb1-47bb-995f-cf7d67007900" containerName="mariadb-database-create" Oct 07 17:23:41 crc kubenswrapper[4681]: E1007 17:23:41.475405 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4673f09e-2140-4dc5-ac9d-af616ddba08d" containerName="ceilometer-central-agent" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.475411 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="4673f09e-2140-4dc5-ac9d-af616ddba08d" containerName="ceilometer-central-agent" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.475575 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="13c6b5bc-aeb1-47bb-995f-cf7d67007900" containerName="mariadb-database-create" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.475589 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="4673f09e-2140-4dc5-ac9d-af616ddba08d" containerName="ceilometer-central-agent" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.475598 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="57a8142b-ea3d-4907-8331-885c973462eb" containerName="mariadb-database-create" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.475613 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="85befb0e-1557-44bb-b783-f0ea67d38de9" containerName="mariadb-database-create" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.475622 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="4673f09e-2140-4dc5-ac9d-af616ddba08d" containerName="ceilometer-notification-agent" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.477171 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.480235 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.480450 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.498967 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.560562 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/932e68dd-1e76-4bf3-8fe6-4d34de164e74-config-data\") pod \"ceilometer-0\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " pod="openstack/ceilometer-0" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.560633 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/932e68dd-1e76-4bf3-8fe6-4d34de164e74-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " pod="openstack/ceilometer-0" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.560660 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/932e68dd-1e76-4bf3-8fe6-4d34de164e74-run-httpd\") pod \"ceilometer-0\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " pod="openstack/ceilometer-0" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.560752 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nd6m\" (UniqueName: \"kubernetes.io/projected/932e68dd-1e76-4bf3-8fe6-4d34de164e74-kube-api-access-7nd6m\") pod \"ceilometer-0\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " pod="openstack/ceilometer-0" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.560779 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/932e68dd-1e76-4bf3-8fe6-4d34de164e74-log-httpd\") pod \"ceilometer-0\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " pod="openstack/ceilometer-0" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.560807 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/932e68dd-1e76-4bf3-8fe6-4d34de164e74-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " pod="openstack/ceilometer-0" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.561232 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/932e68dd-1e76-4bf3-8fe6-4d34de164e74-scripts\") pod \"ceilometer-0\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " pod="openstack/ceilometer-0" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.664844 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nd6m\" (UniqueName: \"kubernetes.io/projected/932e68dd-1e76-4bf3-8fe6-4d34de164e74-kube-api-access-7nd6m\") pod \"ceilometer-0\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " pod="openstack/ceilometer-0" Oct 07 17:23:41 crc kubenswrapper[4681]: 
I1007 17:23:41.664935 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/932e68dd-1e76-4bf3-8fe6-4d34de164e74-log-httpd\") pod \"ceilometer-0\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " pod="openstack/ceilometer-0" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.664957 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/932e68dd-1e76-4bf3-8fe6-4d34de164e74-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " pod="openstack/ceilometer-0" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.665028 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/932e68dd-1e76-4bf3-8fe6-4d34de164e74-scripts\") pod \"ceilometer-0\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " pod="openstack/ceilometer-0" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.665056 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/932e68dd-1e76-4bf3-8fe6-4d34de164e74-config-data\") pod \"ceilometer-0\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " pod="openstack/ceilometer-0" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.665079 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/932e68dd-1e76-4bf3-8fe6-4d34de164e74-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " pod="openstack/ceilometer-0" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.665094 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/932e68dd-1e76-4bf3-8fe6-4d34de164e74-run-httpd\") pod \"ceilometer-0\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " pod="openstack/ceilometer-0" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.665513 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/932e68dd-1e76-4bf3-8fe6-4d34de164e74-run-httpd\") pod \"ceilometer-0\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " pod="openstack/ceilometer-0" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.666188 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/932e68dd-1e76-4bf3-8fe6-4d34de164e74-log-httpd\") pod \"ceilometer-0\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " pod="openstack/ceilometer-0" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.675523 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/932e68dd-1e76-4bf3-8fe6-4d34de164e74-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " pod="openstack/ceilometer-0" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.676553 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/932e68dd-1e76-4bf3-8fe6-4d34de164e74-scripts\") pod \"ceilometer-0\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " pod="openstack/ceilometer-0" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.679276 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/932e68dd-1e76-4bf3-8fe6-4d34de164e74-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " pod="openstack/ceilometer-0" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.685865 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/932e68dd-1e76-4bf3-8fe6-4d34de164e74-config-data\") pod \"ceilometer-0\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " pod="openstack/ceilometer-0" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.706039 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nd6m\" (UniqueName: \"kubernetes.io/projected/932e68dd-1e76-4bf3-8fe6-4d34de164e74-kube-api-access-7nd6m\") pod \"ceilometer-0\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " pod="openstack/ceilometer-0" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.748773 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.748813 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.794361 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.922620 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 17:23:41 crc kubenswrapper[4681]: I1007 17:23:41.978396 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 17:23:42 crc kubenswrapper[4681]: I1007 17:23:42.346285 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 17:23:42 crc kubenswrapper[4681]: I1007 17:23:42.346457 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 17:23:42 crc kubenswrapper[4681]: I1007 17:23:42.445997 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 17:23:42 crc kubenswrapper[4681]: W1007 17:23:42.450568 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod932e68dd_1e76_4bf3_8fe6_4d34de164e74.slice/crio-09f90350129fbcfdcdc9cd3cabcb142e2ea3ee7883236ed857ecb4396d580d42 WatchSource:0}: Error finding container 09f90350129fbcfdcdc9cd3cabcb142e2ea3ee7883236ed857ecb4396d580d42: Status 404 returned error can't find the container with id 09f90350129fbcfdcdc9cd3cabcb142e2ea3ee7883236ed857ecb4396d580d42 Oct 07 17:23:43 crc kubenswrapper[4681]: I1007 17:23:43.039652 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4673f09e-2140-4dc5-ac9d-af616ddba08d" path="/var/lib/kubelet/pods/4673f09e-2140-4dc5-ac9d-af616ddba08d/volumes" Oct 07 17:23:43 crc kubenswrapper[4681]: I1007 17:23:43.353725 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"932e68dd-1e76-4bf3-8fe6-4d34de164e74","Type":"ContainerStarted","Data":"09f90350129fbcfdcdc9cd3cabcb142e2ea3ee7883236ed857ecb4396d580d42"} Oct 07 17:23:44 crc kubenswrapper[4681]: I1007 17:23:44.362524 4681 prober_manager.go:312] "Failed to trigger a manual run" 
probe="Readiness" Oct 07 17:23:44 crc kubenswrapper[4681]: I1007 17:23:44.429113 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" Oct 07 17:23:44 crc kubenswrapper[4681]: I1007 17:23:44.512603 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-49r7s"] Oct 07 17:23:44 crc kubenswrapper[4681]: I1007 17:23:44.512830 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s" podUID="a224f056-2957-405d-a927-4a1b24b01979" containerName="dnsmasq-dns" containerID="cri-o://1b2775109d58b27cc1e8dba7d1e0d90a73552d7d8cb8527b108f90e4330d5a45" gracePeriod=10 Oct 07 17:23:44 crc kubenswrapper[4681]: I1007 17:23:44.633487 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 17:23:44 crc kubenswrapper[4681]: I1007 17:23:44.634998 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 17:23:44 crc kubenswrapper[4681]: I1007 17:23:44.689570 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 17:23:44 crc kubenswrapper[4681]: I1007 17:23:44.715256 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 17:23:45 crc kubenswrapper[4681]: I1007 17:23:45.383315 4681 generic.go:334] "Generic (PLEG): container finished" podID="a224f056-2957-405d-a927-4a1b24b01979" containerID="1b2775109d58b27cc1e8dba7d1e0d90a73552d7d8cb8527b108f90e4330d5a45" exitCode=0 Oct 07 17:23:45 crc kubenswrapper[4681]: I1007 17:23:45.391608 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s" event={"ID":"a224f056-2957-405d-a927-4a1b24b01979","Type":"ContainerDied","Data":"1b2775109d58b27cc1e8dba7d1e0d90a73552d7d8cb8527b108f90e4330d5a45"} Oct 07 17:23:45 crc kubenswrapper[4681]: I1007 17:23:45.391654 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s" event={"ID":"a224f056-2957-405d-a927-4a1b24b01979","Type":"ContainerDied","Data":"5b2b7510d06476133be26e9ce796eb82eb0d5d96bdf5fa2ed96121f2cc9c7ca9"} Oct 07 17:23:45 crc kubenswrapper[4681]: I1007 17:23:45.391666 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b2b7510d06476133be26e9ce796eb82eb0d5d96bdf5fa2ed96121f2cc9c7ca9" Oct 07 17:23:45 crc kubenswrapper[4681]: I1007 17:23:45.391965 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 17:23:45 crc kubenswrapper[4681]: I1007 17:23:45.391985 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 17:23:45 crc kubenswrapper[4681]: I1007 17:23:45.406660 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s" Oct 07 17:23:45 crc kubenswrapper[4681]: I1007 17:23:45.534493 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-dns-svc\") pod \"a224f056-2957-405d-a927-4a1b24b01979\" (UID: \"a224f056-2957-405d-a927-4a1b24b01979\") " Oct 07 17:23:45 crc kubenswrapper[4681]: I1007 17:23:45.535194 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-ovsdbserver-nb\") pod \"a224f056-2957-405d-a927-4a1b24b01979\" (UID: \"a224f056-2957-405d-a927-4a1b24b01979\") " Oct 07 17:23:45 crc kubenswrapper[4681]: I1007 17:23:45.535319 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhmcx\" (UniqueName: \"kubernetes.io/projected/a224f056-2957-405d-a927-4a1b24b01979-kube-api-access-bhmcx\") pod \"a224f056-2957-405d-a927-4a1b24b01979\" (UID: \"a224f056-2957-405d-a927-4a1b24b01979\") " Oct 07 17:23:45 crc kubenswrapper[4681]: I1007 17:23:45.535447 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-config\") pod \"a224f056-2957-405d-a927-4a1b24b01979\" (UID: \"a224f056-2957-405d-a927-4a1b24b01979\") " Oct 07 17:23:45 crc kubenswrapper[4681]: I1007 17:23:45.535624 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-dns-swift-storage-0\") pod \"a224f056-2957-405d-a927-4a1b24b01979\" (UID: \"a224f056-2957-405d-a927-4a1b24b01979\") " Oct 07 17:23:45 crc kubenswrapper[4681]: I1007 17:23:45.535746 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-ovsdbserver-sb\") pod \"a224f056-2957-405d-a927-4a1b24b01979\" (UID: \"a224f056-2957-405d-a927-4a1b24b01979\") " Oct 07 17:23:45 crc kubenswrapper[4681]: I1007 17:23:45.561135 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a224f056-2957-405d-a927-4a1b24b01979-kube-api-access-bhmcx" (OuterVolumeSpecName: "kube-api-access-bhmcx") pod "a224f056-2957-405d-a927-4a1b24b01979" (UID: "a224f056-2957-405d-a927-4a1b24b01979"). InnerVolumeSpecName "kube-api-access-bhmcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:23:45 crc kubenswrapper[4681]: I1007 17:23:45.598354 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a224f056-2957-405d-a927-4a1b24b01979" (UID: "a224f056-2957-405d-a927-4a1b24b01979"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:23:45 crc kubenswrapper[4681]: I1007 17:23:45.641599 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a224f056-2957-405d-a927-4a1b24b01979" (UID: "a224f056-2957-405d-a927-4a1b24b01979"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:23:45 crc kubenswrapper[4681]: I1007 17:23:45.641931 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-ovsdbserver-sb\") pod \"a224f056-2957-405d-a927-4a1b24b01979\" (UID: \"a224f056-2957-405d-a927-4a1b24b01979\") " Oct 07 17:23:45 crc kubenswrapper[4681]: I1007 17:23:45.642359 4681 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:45 crc kubenswrapper[4681]: I1007 17:23:45.642383 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhmcx\" (UniqueName: \"kubernetes.io/projected/a224f056-2957-405d-a927-4a1b24b01979-kube-api-access-bhmcx\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:45 crc kubenswrapper[4681]: W1007 17:23:45.642494 4681 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/a224f056-2957-405d-a927-4a1b24b01979/volumes/kubernetes.io~configmap/ovsdbserver-sb Oct 07 17:23:45 crc kubenswrapper[4681]: I1007 17:23:45.642509 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a224f056-2957-405d-a927-4a1b24b01979" (UID: "a224f056-2957-405d-a927-4a1b24b01979"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:23:45 crc kubenswrapper[4681]: I1007 17:23:45.668297 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-config" (OuterVolumeSpecName: "config") pod "a224f056-2957-405d-a927-4a1b24b01979" (UID: "a224f056-2957-405d-a927-4a1b24b01979"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:23:45 crc kubenswrapper[4681]: E1007 17:23:45.675293 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-dns-swift-storage-0 podName:a224f056-2957-405d-a927-4a1b24b01979 nodeName:}" failed. No retries permitted until 2025-10-07 17:23:46.1752633 +0000 UTC m=+1229.822674855 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "dns-swift-storage-0" (UniqueName: "kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-dns-swift-storage-0") pod "a224f056-2957-405d-a927-4a1b24b01979" (UID: "a224f056-2957-405d-a927-4a1b24b01979") : error deleting /var/lib/kubelet/pods/a224f056-2957-405d-a927-4a1b24b01979/volume-subpaths: remove /var/lib/kubelet/pods/a224f056-2957-405d-a927-4a1b24b01979/volume-subpaths: no such file or directory Oct 07 17:23:45 crc kubenswrapper[4681]: I1007 17:23:45.675734 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a224f056-2957-405d-a927-4a1b24b01979" (UID: "a224f056-2957-405d-a927-4a1b24b01979"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:23:45 crc kubenswrapper[4681]: I1007 17:23:45.743956 4681 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:45 crc kubenswrapper[4681]: I1007 17:23:45.743991 4681 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:45 crc kubenswrapper[4681]: I1007 17:23:45.744001 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:46 crc kubenswrapper[4681]: I1007 17:23:46.203101 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7d8b9fbb46-6wjkq" podUID="07f40489-1614-45c8-864b-2288473c7c1d" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 17:23:46 crc kubenswrapper[4681]: I1007 17:23:46.203718 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7d8b9fbb46-6wjkq" podUID="07f40489-1614-45c8-864b-2288473c7c1d" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 17:23:46 crc kubenswrapper[4681]: I1007 17:23:46.253159 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-dns-swift-storage-0\") pod \"a224f056-2957-405d-a927-4a1b24b01979\" (UID: \"a224f056-2957-405d-a927-4a1b24b01979\") " Oct 07 17:23:46 crc kubenswrapper[4681]: I1007 17:23:46.253589 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a224f056-2957-405d-a927-4a1b24b01979" (UID: "a224f056-2957-405d-a927-4a1b24b01979"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:23:46 crc kubenswrapper[4681]: I1007 17:23:46.253953 4681 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a224f056-2957-405d-a927-4a1b24b01979-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:46 crc kubenswrapper[4681]: I1007 17:23:46.393036 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"932e68dd-1e76-4bf3-8fe6-4d34de164e74","Type":"ContainerStarted","Data":"541b74afc145f6f164980d4956e23d08a3fe408daa562e5e211972ea8f36f713"} Oct 07 17:23:46 crc kubenswrapper[4681]: I1007 17:23:46.393301 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-49r7s" Oct 07 17:23:46 crc kubenswrapper[4681]: I1007 17:23:46.421370 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-49r7s"] Oct 07 17:23:46 crc kubenswrapper[4681]: I1007 17:23:46.432795 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-49r7s"] Oct 07 17:23:46 crc kubenswrapper[4681]: I1007 17:23:46.782108 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-976bbb468-rxpr4" podUID="c0e22a43-39e2-4154-b998-dcc84cadf262" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 17:23:46 crc kubenswrapper[4681]: I1007 17:23:46.865079 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-976bbb468-rxpr4" podUID="c0e22a43-39e2-4154-b998-dcc84cadf262" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 17:23:46 crc kubenswrapper[4681]: I1007 17:23:46.865515 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-976bbb468-rxpr4" podUID="c0e22a43-39e2-4154-b998-dcc84cadf262" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 17:23:46 crc kubenswrapper[4681]: I1007 17:23:46.865498 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-976bbb468-rxpr4" podUID="c0e22a43-39e2-4154-b998-dcc84cadf262" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 17:23:47 crc kubenswrapper[4681]: I1007 17:23:47.038733 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a224f056-2957-405d-a927-4a1b24b01979" path="/var/lib/kubelet/pods/a224f056-2957-405d-a927-4a1b24b01979/volumes" Oct 07 17:23:47 crc kubenswrapper[4681]: I1007 17:23:47.401023 4681 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 17:23:47 crc kubenswrapper[4681]: I1007 17:23:47.401056 4681 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 17:23:47 crc kubenswrapper[4681]: I1007 17:23:47.442461 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-64677bd694-6xgb2" podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.142:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.142:8443: connect: connection refused" Oct 07 17:23:47 crc kubenswrapper[4681]: I1007 17:23:47.621182 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-f945f854d-hm49c" podUID="02a91326-9285-4589-a05b-c0a2c2ed397e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Oct 07 17:23:48 crc kubenswrapper[4681]: I1007 17:23:48.410750 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"932e68dd-1e76-4bf3-8fe6-4d34de164e74","Type":"ContainerStarted","Data":"1c4b49320b5dd5267cffbb1b0a3e5df00a7564f7bbd6046738f059f604d93c65"} Oct 07 17:23:48 crc kubenswrapper[4681]: I1007 17:23:48.644131 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 17:23:48 crc kubenswrapper[4681]: I1007 17:23:48.644736 4681 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 17:23:49 crc kubenswrapper[4681]: I1007 17:23:49.197022 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7d8b9fbb46-6wjkq" podUID="07f40489-1614-45c8-864b-2288473c7c1d" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 17:23:49 crc kubenswrapper[4681]: I1007 17:23:49.197057 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7d8b9fbb46-6wjkq" podUID="07f40489-1614-45c8-864b-2288473c7c1d" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 17:23:49 crc kubenswrapper[4681]: I1007 17:23:49.423670 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"932e68dd-1e76-4bf3-8fe6-4d34de164e74","Type":"ContainerStarted","Data":"abdf8223b4adb020702f680ededf76a5d25ca568e07de8c6214d1a24ac0a59e5"} Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.130143 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-afe2-account-create-vd26v"] Oct 07 17:23:50 crc kubenswrapper[4681]: E1007 17:23:50.135636 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a224f056-2957-405d-a927-4a1b24b01979" containerName="dnsmasq-dns" Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.135664 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="a224f056-2957-405d-a927-4a1b24b01979" containerName="dnsmasq-dns" Oct 07 17:23:50 crc kubenswrapper[4681]: E1007 17:23:50.135692 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a224f056-2957-405d-a927-4a1b24b01979" containerName="init" Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.135700 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="a224f056-2957-405d-a927-4a1b24b01979" containerName="init" Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.137933 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="a224f056-2957-405d-a927-4a1b24b01979" containerName="dnsmasq-dns" Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.138525 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-afe2-account-create-vd26v" Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.155307 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.165390 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-afe2-account-create-vd26v"] Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.232798 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q6mb\" (UniqueName: \"kubernetes.io/projected/19fb8b33-cbe4-46dd-83b0-d35325b63940-kube-api-access-9q6mb\") pod \"nova-api-afe2-account-create-vd26v\" (UID: \"19fb8b33-cbe4-46dd-83b0-d35325b63940\") " pod="openstack/nova-api-afe2-account-create-vd26v" Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.336763 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q6mb\" (UniqueName: \"kubernetes.io/projected/19fb8b33-cbe4-46dd-83b0-d35325b63940-kube-api-access-9q6mb\") pod \"nova-api-afe2-account-create-vd26v\" (UID: \"19fb8b33-cbe4-46dd-83b0-d35325b63940\") " pod="openstack/nova-api-afe2-account-create-vd26v" Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.343237 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-03d8-account-create-8f5sj"] Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.349051 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-03d8-account-create-8f5sj" Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.356979 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.392830 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-03d8-account-create-8f5sj"] Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.395378 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q6mb\" (UniqueName: \"kubernetes.io/projected/19fb8b33-cbe4-46dd-83b0-d35325b63940-kube-api-access-9q6mb\") pod \"nova-api-afe2-account-create-vd26v\" (UID: \"19fb8b33-cbe4-46dd-83b0-d35325b63940\") " pod="openstack/nova-api-afe2-account-create-vd26v" Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.442615 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5fln\" (UniqueName: \"kubernetes.io/projected/682a62d1-b6b0-4f0d-9f94-ba1f48d92447-kube-api-access-w5fln\") pod \"nova-cell0-03d8-account-create-8f5sj\" (UID: \"682a62d1-b6b0-4f0d-9f94-ba1f48d92447\") " pod="openstack/nova-cell0-03d8-account-create-8f5sj" Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.443166 4681 generic.go:334] "Generic (PLEG): container finished" podID="a53e8384-cd97-4cec-ae70-918f86112a99" containerID="650f29af75c677191b6adcab2341ac25de978d7148c459c417359d2622dc6b92" exitCode=0 Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.443206 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-x9lv2" event={"ID":"a53e8384-cd97-4cec-ae70-918f86112a99","Type":"ContainerDied","Data":"650f29af75c677191b6adcab2341ac25de978d7148c459c417359d2622dc6b92"} Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.542213 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-1c2d-account-create-qp7z8"] Oct 07 17:23:50 crc 
kubenswrapper[4681]: I1007 17:23:50.543443 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1c2d-account-create-qp7z8" Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.545553 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5fln\" (UniqueName: \"kubernetes.io/projected/682a62d1-b6b0-4f0d-9f94-ba1f48d92447-kube-api-access-w5fln\") pod \"nova-cell0-03d8-account-create-8f5sj\" (UID: \"682a62d1-b6b0-4f0d-9f94-ba1f48d92447\") " pod="openstack/nova-cell0-03d8-account-create-8f5sj" Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.548748 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.555301 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1c2d-account-create-qp7z8"] Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.565793 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5fln\" (UniqueName: \"kubernetes.io/projected/682a62d1-b6b0-4f0d-9f94-ba1f48d92447-kube-api-access-w5fln\") pod \"nova-cell0-03d8-account-create-8f5sj\" (UID: \"682a62d1-b6b0-4f0d-9f94-ba1f48d92447\") " pod="openstack/nova-cell0-03d8-account-create-8f5sj" Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.601273 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-afe2-account-create-vd26v" Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.647075 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxjpk\" (UniqueName: \"kubernetes.io/projected/f39303f5-20d2-4d09-8033-70a8c3ad916b-kube-api-access-nxjpk\") pod \"nova-cell1-1c2d-account-create-qp7z8\" (UID: \"f39303f5-20d2-4d09-8033-70a8c3ad916b\") " pod="openstack/nova-cell1-1c2d-account-create-qp7z8" Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.747956 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-03d8-account-create-8f5sj" Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.749417 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxjpk\" (UniqueName: \"kubernetes.io/projected/f39303f5-20d2-4d09-8033-70a8c3ad916b-kube-api-access-nxjpk\") pod \"nova-cell1-1c2d-account-create-qp7z8\" (UID: \"f39303f5-20d2-4d09-8033-70a8c3ad916b\") " pod="openstack/nova-cell1-1c2d-account-create-qp7z8" Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.774484 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.776212 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxjpk\" (UniqueName: \"kubernetes.io/projected/f39303f5-20d2-4d09-8033-70a8c3ad916b-kube-api-access-nxjpk\") pod \"nova-cell1-1c2d-account-create-qp7z8\" (UID: \"f39303f5-20d2-4d09-8033-70a8c3ad916b\") " pod="openstack/nova-cell1-1c2d-account-create-qp7z8" Oct 07 17:23:50 crc kubenswrapper[4681]: I1007 17:23:50.875345 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-1c2d-account-create-qp7z8" Oct 07 17:23:51 crc kubenswrapper[4681]: I1007 17:23:51.219160 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7d8b9fbb46-6wjkq" podUID="07f40489-1614-45c8-864b-2288473c7c1d" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 17:23:51 crc kubenswrapper[4681]: I1007 17:23:51.454466 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"932e68dd-1e76-4bf3-8fe6-4d34de164e74","Type":"ContainerStarted","Data":"0b7e194c466bfda2a65b9469af21a010c6ca66dfe818fadf0281802b1a8bd81c"} Oct 07 17:23:51 crc kubenswrapper[4681]: I1007 17:23:51.454702 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 17:23:51 crc kubenswrapper[4681]: I1007 17:23:51.472048 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-afe2-account-create-vd26v"] Oct 07 17:23:51 crc kubenswrapper[4681]: I1007 17:23:51.489720 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.814469721 podStartE2EDuration="10.489698661s" podCreationTimestamp="2025-10-07 17:23:41 +0000 UTC" firstStartedPulling="2025-10-07 17:23:42.452826367 +0000 UTC m=+1226.100237922" lastFinishedPulling="2025-10-07 17:23:50.128055307 +0000 UTC m=+1233.775466862" observedRunningTime="2025-10-07 17:23:51.480483964 +0000 UTC m=+1235.127895519" watchObservedRunningTime="2025-10-07 17:23:51.489698661 +0000 UTC m=+1235.137110216" Oct 07 17:23:51 crc kubenswrapper[4681]: I1007 17:23:51.607034 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-03d8-account-create-8f5sj"] Oct 07 17:23:51 crc kubenswrapper[4681]: I1007 17:23:51.640526 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1c2d-account-create-qp7z8"] Oct 07 17:23:51 crc kubenswrapper[4681]: W1007 17:23:51.650213 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod682a62d1_b6b0_4f0d_9f94_ba1f48d92447.slice/crio-0262eab1ed0fcb63dca42deb43470c5a76ad8bcaa893f7aabe2ffb549d5287c0 WatchSource:0}: Error finding container 0262eab1ed0fcb63dca42deb43470c5a76ad8bcaa893f7aabe2ffb549d5287c0: Status 404 returned error can't find the container with id 0262eab1ed0fcb63dca42deb43470c5a76ad8bcaa893f7aabe2ffb549d5287c0 Oct 07 17:23:51 crc kubenswrapper[4681]: I1007 17:23:51.824068 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-976bbb468-rxpr4" podUID="c0e22a43-39e2-4154-b998-dcc84cadf262" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 17:23:51 crc kubenswrapper[4681]: I1007 17:23:51.844132 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-976bbb468-rxpr4" Oct 07 17:23:51 crc kubenswrapper[4681]: I1007 17:23:51.953397 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-976bbb468-rxpr4" podUID="c0e22a43-39e2-4154-b998-dcc84cadf262" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while 
awaiting headers)" Oct 07 17:23:51 crc kubenswrapper[4681]: I1007 17:23:51.954044 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-976bbb468-rxpr4" podUID="c0e22a43-39e2-4154-b998-dcc84cadf262" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 17:23:51 crc kubenswrapper[4681]: I1007 17:23:51.974716 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-976bbb468-rxpr4" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.016179 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-x9lv2" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.129443 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a53e8384-cd97-4cec-ae70-918f86112a99-etc-machine-id\") pod \"a53e8384-cd97-4cec-ae70-918f86112a99\" (UID: \"a53e8384-cd97-4cec-ae70-918f86112a99\") " Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.129532 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a53e8384-cd97-4cec-ae70-918f86112a99-scripts\") pod \"a53e8384-cd97-4cec-ae70-918f86112a99\" (UID: \"a53e8384-cd97-4cec-ae70-918f86112a99\") " Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.129592 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qftrr\" (UniqueName: \"kubernetes.io/projected/a53e8384-cd97-4cec-ae70-918f86112a99-kube-api-access-qftrr\") pod \"a53e8384-cd97-4cec-ae70-918f86112a99\" (UID: \"a53e8384-cd97-4cec-ae70-918f86112a99\") " Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.129649 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a53e8384-cd97-4cec-ae70-918f86112a99-config-data\") pod \"a53e8384-cd97-4cec-ae70-918f86112a99\" (UID: \"a53e8384-cd97-4cec-ae70-918f86112a99\") " Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.129723 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a53e8384-cd97-4cec-ae70-918f86112a99-combined-ca-bundle\") pod \"a53e8384-cd97-4cec-ae70-918f86112a99\" (UID: \"a53e8384-cd97-4cec-ae70-918f86112a99\") " Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.129760 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a53e8384-cd97-4cec-ae70-918f86112a99-db-sync-config-data\") pod \"a53e8384-cd97-4cec-ae70-918f86112a99\" (UID: \"a53e8384-cd97-4cec-ae70-918f86112a99\") " Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.130642 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a53e8384-cd97-4cec-ae70-918f86112a99-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a53e8384-cd97-4cec-ae70-918f86112a99" (UID: "a53e8384-cd97-4cec-ae70-918f86112a99"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.144138 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a53e8384-cd97-4cec-ae70-918f86112a99-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a53e8384-cd97-4cec-ae70-918f86112a99" (UID: "a53e8384-cd97-4cec-ae70-918f86112a99"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.175057 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a53e8384-cd97-4cec-ae70-918f86112a99-kube-api-access-qftrr" (OuterVolumeSpecName: "kube-api-access-qftrr") pod "a53e8384-cd97-4cec-ae70-918f86112a99" (UID: "a53e8384-cd97-4cec-ae70-918f86112a99"). InnerVolumeSpecName "kube-api-access-qftrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.176115 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a53e8384-cd97-4cec-ae70-918f86112a99-scripts" (OuterVolumeSpecName: "scripts") pod "a53e8384-cd97-4cec-ae70-918f86112a99" (UID: "a53e8384-cd97-4cec-ae70-918f86112a99"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.227084 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a53e8384-cd97-4cec-ae70-918f86112a99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a53e8384-cd97-4cec-ae70-918f86112a99" (UID: "a53e8384-cd97-4cec-ae70-918f86112a99"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.232278 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a53e8384-cd97-4cec-ae70-918f86112a99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.232323 4681 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a53e8384-cd97-4cec-ae70-918f86112a99-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.232332 4681 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a53e8384-cd97-4cec-ae70-918f86112a99-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.232340 4681 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a53e8384-cd97-4cec-ae70-918f86112a99-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.232349 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qftrr\" (UniqueName: \"kubernetes.io/projected/a53e8384-cd97-4cec-ae70-918f86112a99-kube-api-access-qftrr\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.342699 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a53e8384-cd97-4cec-ae70-918f86112a99-config-data" (OuterVolumeSpecName: "config-data") pod "a53e8384-cd97-4cec-ae70-918f86112a99" (UID: "a53e8384-cd97-4cec-ae70-918f86112a99"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.381336 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.381445 4681 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.383344 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.447703 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a53e8384-cd97-4cec-ae70-918f86112a99-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.479169 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-x9lv2" event={"ID":"a53e8384-cd97-4cec-ae70-918f86112a99","Type":"ContainerDied","Data":"01f1a0b3e7aa451a2183604f240199c72451313a062ce872059da116669803c3"} Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.479206 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01f1a0b3e7aa451a2183604f240199c72451313a062ce872059da116669803c3" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.479262 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-x9lv2" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.498019 4681 generic.go:334] "Generic (PLEG): container finished" podID="f39303f5-20d2-4d09-8033-70a8c3ad916b" containerID="249da69380c06575044c5f8afa96025a6444ff4ecaf86de16d1de2e38b09aefe" exitCode=0 Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.498107 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1c2d-account-create-qp7z8" event={"ID":"f39303f5-20d2-4d09-8033-70a8c3ad916b","Type":"ContainerDied","Data":"249da69380c06575044c5f8afa96025a6444ff4ecaf86de16d1de2e38b09aefe"} Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.498139 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1c2d-account-create-qp7z8" event={"ID":"f39303f5-20d2-4d09-8033-70a8c3ad916b","Type":"ContainerStarted","Data":"2c5ca2e4ce46d26dc54c9ee54de2ffadc09d1f754fb3e1a73b920927c5448d59"} Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.508143 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-03d8-account-create-8f5sj" event={"ID":"682a62d1-b6b0-4f0d-9f94-ba1f48d92447","Type":"ContainerStarted","Data":"5ad82b80c83d78e8f5d52a2d764c63675763c4c4c5cdbf323044015855117723"} Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.508190 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-03d8-account-create-8f5sj" event={"ID":"682a62d1-b6b0-4f0d-9f94-ba1f48d92447","Type":"ContainerStarted","Data":"0262eab1ed0fcb63dca42deb43470c5a76ad8bcaa893f7aabe2ffb549d5287c0"} Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.520607 4681 generic.go:334] "Generic (PLEG): container finished" podID="19fb8b33-cbe4-46dd-83b0-d35325b63940" containerID="6f5537b199e5b8bc513252240eec6cd21166d8c2bbe5e17d3d585a9fb98a93a7" exitCode=0 Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.521426 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-afe2-account-create-vd26v" 
event={"ID":"19fb8b33-cbe4-46dd-83b0-d35325b63940","Type":"ContainerDied","Data":"6f5537b199e5b8bc513252240eec6cd21166d8c2bbe5e17d3d585a9fb98a93a7"} Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.521449 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-afe2-account-create-vd26v" event={"ID":"19fb8b33-cbe4-46dd-83b0-d35325b63940","Type":"ContainerStarted","Data":"69658166c895516cbbce00aa7da1aaf9dea46cf435d9f2663e15a26b45b0a4c0"} Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.624510 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-03d8-account-create-8f5sj" podStartSLOduration=2.624492794 podStartE2EDuration="2.624492794s" podCreationTimestamp="2025-10-07 17:23:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:23:52.54712532 +0000 UTC m=+1236.194536875" watchObservedRunningTime="2025-10-07 17:23:52.624492794 +0000 UTC m=+1236.271904349" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.787591 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 17:23:52 crc kubenswrapper[4681]: E1007 17:23:52.788100 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a53e8384-cd97-4cec-ae70-918f86112a99" containerName="cinder-db-sync" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.788116 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="a53e8384-cd97-4cec-ae70-918f86112a99" containerName="cinder-db-sync" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.788391 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="a53e8384-cd97-4cec-ae70-918f86112a99" containerName="cinder-db-sync" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.789549 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.801148 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.801308 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.801396 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-tnpq2" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.809285 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.851492 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.867729 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-config-data\") pod \"cinder-scheduler-0\" (UID: \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\") " pod="openstack/cinder-scheduler-0" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.867805 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\") " pod="openstack/cinder-scheduler-0" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.867828 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\") " pod="openstack/cinder-scheduler-0" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.867861 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\") " pod="openstack/cinder-scheduler-0" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.867948 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-scripts\") pod \"cinder-scheduler-0\" (UID: \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\") " pod="openstack/cinder-scheduler-0" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.868020 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbrgp\" (UniqueName: \"kubernetes.io/projected/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-kube-api-access-lbrgp\") pod \"cinder-scheduler-0\" (UID: \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\") " pod="openstack/cinder-scheduler-0" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.900953 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-txwrt"] Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.902487 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-txwrt" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.908100 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-txwrt"] Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.976943 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbrgp\" (UniqueName: \"kubernetes.io/projected/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-kube-api-access-lbrgp\") pod \"cinder-scheduler-0\" (UID: \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\") " pod="openstack/cinder-scheduler-0" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.977239 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-config-data\") pod \"cinder-scheduler-0\" (UID: \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\") " pod="openstack/cinder-scheduler-0" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.977361 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-dns-svc\") pod \"dnsmasq-dns-6578955fd5-txwrt\" (UID: \"01926d51-8e89-44e0-8032-7a701b7fcb92\") " pod="openstack/dnsmasq-dns-6578955fd5-txwrt" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.977476 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\") " pod="openstack/cinder-scheduler-0" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.977577 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\") " pod="openstack/cinder-scheduler-0" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.977680 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-txwrt\" (UID: \"01926d51-8e89-44e0-8032-7a701b7fcb92\") " pod="openstack/dnsmasq-dns-6578955fd5-txwrt" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.977807 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrbl7\" (UniqueName: \"kubernetes.io/projected/01926d51-8e89-44e0-8032-7a701b7fcb92-kube-api-access-lrbl7\") pod \"dnsmasq-dns-6578955fd5-txwrt\" (UID: \"01926d51-8e89-44e0-8032-7a701b7fcb92\") " pod="openstack/dnsmasq-dns-6578955fd5-txwrt" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.977937 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\") " pod="openstack/cinder-scheduler-0" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.978107 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-txwrt\" (UID: \"01926d51-8e89-44e0-8032-7a701b7fcb92\") " pod="openstack/dnsmasq-dns-6578955fd5-txwrt" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.978213 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-config\") pod \"dnsmasq-dns-6578955fd5-txwrt\" (UID: \"01926d51-8e89-44e0-8032-7a701b7fcb92\") " pod="openstack/dnsmasq-dns-6578955fd5-txwrt" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.978325 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-txwrt\" (UID: \"01926d51-8e89-44e0-8032-7a701b7fcb92\") " pod="openstack/dnsmasq-dns-6578955fd5-txwrt" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.978417 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-scripts\") pod \"cinder-scheduler-0\" (UID: \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\") " pod="openstack/cinder-scheduler-0" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.988503 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-scripts\") pod \"cinder-scheduler-0\" (UID: \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\") " pod="openstack/cinder-scheduler-0" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.991629 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\") " pod="openstack/cinder-scheduler-0" Oct 07 17:23:52 crc kubenswrapper[4681]: I1007 17:23:52.992667 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-config-data\") pod \"cinder-scheduler-0\" (UID: \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\") " pod="openstack/cinder-scheduler-0" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.007445 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\") " pod="openstack/cinder-scheduler-0" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.008522 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\") " pod="openstack/cinder-scheduler-0" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.021449 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbrgp\" (UniqueName: \"kubernetes.io/projected/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-kube-api-access-lbrgp\") pod \"cinder-scheduler-0\" (UID: \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\") " pod="openstack/cinder-scheduler-0" Oct 07 17:23:53 crc 
Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.076805 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.078335 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.078555 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.084237 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.085740 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-dns-svc\") pod \"dnsmasq-dns-6578955fd5-txwrt\" (UID: \"01926d51-8e89-44e0-8032-7a701b7fcb92\") " pod="openstack/dnsmasq-dns-6578955fd5-txwrt"
Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.086280 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-txwrt\" (UID: \"01926d51-8e89-44e0-8032-7a701b7fcb92\") " pod="openstack/dnsmasq-dns-6578955fd5-txwrt"
Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.087163 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-dns-svc\") pod \"dnsmasq-dns-6578955fd5-txwrt\" (UID: \"01926d51-8e89-44e0-8032-7a701b7fcb92\") " pod="openstack/dnsmasq-dns-6578955fd5-txwrt"
Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.087171 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-txwrt\" (UID: \"01926d51-8e89-44e0-8032-7a701b7fcb92\") " pod="openstack/dnsmasq-dns-6578955fd5-txwrt"
Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.087430 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrbl7\" (UniqueName: \"kubernetes.io/projected/01926d51-8e89-44e0-8032-7a701b7fcb92-kube-api-access-lrbl7\") pod \"dnsmasq-dns-6578955fd5-txwrt\" (UID: \"01926d51-8e89-44e0-8032-7a701b7fcb92\") " pod="openstack/dnsmasq-dns-6578955fd5-txwrt"
Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.088160 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-txwrt\" (UID: \"01926d51-8e89-44e0-8032-7a701b7fcb92\") " pod="openstack/dnsmasq-dns-6578955fd5-txwrt"
Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.088272 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-config\") pod \"dnsmasq-dns-6578955fd5-txwrt\" (UID: \"01926d51-8e89-44e0-8032-7a701b7fcb92\") " pod="openstack/dnsmasq-dns-6578955fd5-txwrt"
Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.088394 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName:
\"kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-txwrt\" (UID: \"01926d51-8e89-44e0-8032-7a701b7fcb92\") " pod="openstack/dnsmasq-dns-6578955fd5-txwrt" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.090213 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-config\") pod \"dnsmasq-dns-6578955fd5-txwrt\" (UID: \"01926d51-8e89-44e0-8032-7a701b7fcb92\") " pod="openstack/dnsmasq-dns-6578955fd5-txwrt" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.095093 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-txwrt\" (UID: \"01926d51-8e89-44e0-8032-7a701b7fcb92\") " pod="openstack/dnsmasq-dns-6578955fd5-txwrt" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.097348 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-txwrt\" (UID: \"01926d51-8e89-44e0-8032-7a701b7fcb92\") " pod="openstack/dnsmasq-dns-6578955fd5-txwrt" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.155665 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.174737 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrbl7\" (UniqueName: \"kubernetes.io/projected/01926d51-8e89-44e0-8032-7a701b7fcb92-kube-api-access-lrbl7\") pod \"dnsmasq-dns-6578955fd5-txwrt\" (UID: \"01926d51-8e89-44e0-8032-7a701b7fcb92\") " pod="openstack/dnsmasq-dns-6578955fd5-txwrt" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.189923 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7199ab3e-07fd-4c98-81e1-535f69a0f76d-config-data-custom\") pod \"cinder-api-0\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " pod="openstack/cinder-api-0" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.190016 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7199ab3e-07fd-4c98-81e1-535f69a0f76d-logs\") pod \"cinder-api-0\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " pod="openstack/cinder-api-0" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.190035 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7199ab3e-07fd-4c98-81e1-535f69a0f76d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " pod="openstack/cinder-api-0" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.190076 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7199ab3e-07fd-4c98-81e1-535f69a0f76d-scripts\") pod \"cinder-api-0\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " pod="openstack/cinder-api-0" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.190092 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9f7vm\" (UniqueName: \"kubernetes.io/projected/7199ab3e-07fd-4c98-81e1-535f69a0f76d-kube-api-access-9f7vm\") pod \"cinder-api-0\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " pod="openstack/cinder-api-0" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.190118 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7199ab3e-07fd-4c98-81e1-535f69a0f76d-config-data\") pod \"cinder-api-0\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " pod="openstack/cinder-api-0" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.190150 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7199ab3e-07fd-4c98-81e1-535f69a0f76d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " pod="openstack/cinder-api-0" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.261495 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-txwrt" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.291604 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7199ab3e-07fd-4c98-81e1-535f69a0f76d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " pod="openstack/cinder-api-0" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.291684 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7199ab3e-07fd-4c98-81e1-535f69a0f76d-config-data-custom\") pod \"cinder-api-0\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " pod="openstack/cinder-api-0" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.291747 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7199ab3e-07fd-4c98-81e1-535f69a0f76d-logs\") pod \"cinder-api-0\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " pod="openstack/cinder-api-0" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.291764 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7199ab3e-07fd-4c98-81e1-535f69a0f76d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " pod="openstack/cinder-api-0" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.291797 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7199ab3e-07fd-4c98-81e1-535f69a0f76d-scripts\") pod \"cinder-api-0\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " pod="openstack/cinder-api-0" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.291813 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f7vm\" (UniqueName: \"kubernetes.io/projected/7199ab3e-07fd-4c98-81e1-535f69a0f76d-kube-api-access-9f7vm\") pod \"cinder-api-0\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " pod="openstack/cinder-api-0" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.291840 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7199ab3e-07fd-4c98-81e1-535f69a0f76d-config-data\") pod 
\"cinder-api-0\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " pod="openstack/cinder-api-0" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.295704 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7199ab3e-07fd-4c98-81e1-535f69a0f76d-logs\") pod \"cinder-api-0\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " pod="openstack/cinder-api-0" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.296129 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7199ab3e-07fd-4c98-81e1-535f69a0f76d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " pod="openstack/cinder-api-0" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.302084 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7199ab3e-07fd-4c98-81e1-535f69a0f76d-config-data\") pod \"cinder-api-0\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " pod="openstack/cinder-api-0" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.303227 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7199ab3e-07fd-4c98-81e1-535f69a0f76d-scripts\") pod \"cinder-api-0\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " pod="openstack/cinder-api-0" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.307026 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7199ab3e-07fd-4c98-81e1-535f69a0f76d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " pod="openstack/cinder-api-0" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.313839 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7199ab3e-07fd-4c98-81e1-535f69a0f76d-config-data-custom\") pod \"cinder-api-0\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " pod="openstack/cinder-api-0" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.350127 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f7vm\" (UniqueName: \"kubernetes.io/projected/7199ab3e-07fd-4c98-81e1-535f69a0f76d-kube-api-access-9f7vm\") pod \"cinder-api-0\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " pod="openstack/cinder-api-0" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.410364 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.561990 4681 generic.go:334] "Generic (PLEG): container finished" podID="682a62d1-b6b0-4f0d-9f94-ba1f48d92447" containerID="5ad82b80c83d78e8f5d52a2d764c63675763c4c4c5cdbf323044015855117723" exitCode=0 Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.562421 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-03d8-account-create-8f5sj" event={"ID":"682a62d1-b6b0-4f0d-9f94-ba1f48d92447","Type":"ContainerDied","Data":"5ad82b80c83d78e8f5d52a2d764c63675763c4c4c5cdbf323044015855117723"} Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.669326 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:53 crc kubenswrapper[4681]: I1007 17:23:53.989925 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 17:23:54 crc kubenswrapper[4681]: I1007 17:23:54.227890 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7d8b9fbb46-6wjkq" podUID="07f40489-1614-45c8-864b-2288473c7c1d" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 17:23:54 crc kubenswrapper[4681]: I1007 17:23:54.255199 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7d8b9fbb46-6wjkq" Oct 07 17:23:54 crc kubenswrapper[4681]: I1007 17:23:54.365233 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-976bbb468-rxpr4"] Oct 07 17:23:54 crc kubenswrapper[4681]: I1007 17:23:54.365684 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-976bbb468-rxpr4" podUID="c0e22a43-39e2-4154-b998-dcc84cadf262" containerName="barbican-api-log" containerID="cri-o://553dc9dbf1649d562b756878a4c23cf75a55aab78eade6900e43db6f13dd0a21" gracePeriod=30 Oct 07 17:23:54 crc kubenswrapper[4681]: I1007 17:23:54.365951 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-976bbb468-rxpr4" podUID="c0e22a43-39e2-4154-b998-dcc84cadf262" containerName="barbican-api" containerID="cri-o://38420a1fbd331bf5c3a9fe6aa06da5ece1572a8b32a67d00aff20238dd72afd3" gracePeriod=30 Oct 07 17:23:54 crc kubenswrapper[4681]: I1007 17:23:54.389318 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1c2d-account-create-qp7z8" Oct 07 17:23:54 crc kubenswrapper[4681]: I1007 17:23:54.434658 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxjpk\" (UniqueName: \"kubernetes.io/projected/f39303f5-20d2-4d09-8033-70a8c3ad916b-kube-api-access-nxjpk\") pod \"f39303f5-20d2-4d09-8033-70a8c3ad916b\" (UID: \"f39303f5-20d2-4d09-8033-70a8c3ad916b\") " Oct 07 17:23:54 crc kubenswrapper[4681]: I1007 17:23:54.442381 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f39303f5-20d2-4d09-8033-70a8c3ad916b-kube-api-access-nxjpk" (OuterVolumeSpecName: "kube-api-access-nxjpk") pod "f39303f5-20d2-4d09-8033-70a8c3ad916b" (UID: "f39303f5-20d2-4d09-8033-70a8c3ad916b"). InnerVolumeSpecName "kube-api-access-nxjpk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:23:54 crc kubenswrapper[4681]: I1007 17:23:54.468867 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 17:23:54 crc kubenswrapper[4681]: I1007 17:23:54.540162 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxjpk\" (UniqueName: \"kubernetes.io/projected/f39303f5-20d2-4d09-8033-70a8c3ad916b-kube-api-access-nxjpk\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:54 crc kubenswrapper[4681]: I1007 17:23:54.575279 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-afe2-account-create-vd26v" Oct 07 17:23:54 crc kubenswrapper[4681]: I1007 17:23:54.582534 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c","Type":"ContainerStarted","Data":"2403faf2f3f955e0b4e445e07296a6e4c55e90a390611151e2fd447d40f540a0"} Oct 07 17:23:54 crc kubenswrapper[4681]: I1007 17:23:54.584581 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-afe2-account-create-vd26v" Oct 07 17:23:54 crc kubenswrapper[4681]: I1007 17:23:54.584687 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-afe2-account-create-vd26v" event={"ID":"19fb8b33-cbe4-46dd-83b0-d35325b63940","Type":"ContainerDied","Data":"69658166c895516cbbce00aa7da1aaf9dea46cf435d9f2663e15a26b45b0a4c0"} Oct 07 17:23:54 crc kubenswrapper[4681]: I1007 17:23:54.584712 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69658166c895516cbbce00aa7da1aaf9dea46cf435d9f2663e15a26b45b0a4c0" Oct 07 17:23:54 crc kubenswrapper[4681]: I1007 17:23:54.643088 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q6mb\" (UniqueName: \"kubernetes.io/projected/19fb8b33-cbe4-46dd-83b0-d35325b63940-kube-api-access-9q6mb\") pod \"19fb8b33-cbe4-46dd-83b0-d35325b63940\" (UID: \"19fb8b33-cbe4-46dd-83b0-d35325b63940\") " Oct 07 17:23:54 crc kubenswrapper[4681]: I1007 17:23:54.644317 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1c2d-account-create-qp7z8" event={"ID":"f39303f5-20d2-4d09-8033-70a8c3ad916b","Type":"ContainerDied","Data":"2c5ca2e4ce46d26dc54c9ee54de2ffadc09d1f754fb3e1a73b920927c5448d59"} Oct 07 17:23:54 crc kubenswrapper[4681]: I1007 17:23:54.644355 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c5ca2e4ce46d26dc54c9ee54de2ffadc09d1f754fb3e1a73b920927c5448d59" Oct 07 17:23:54 crc kubenswrapper[4681]: I1007 17:23:54.644416 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1c2d-account-create-qp7z8" Oct 07 17:23:54 crc kubenswrapper[4681]: I1007 17:23:54.653318 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19fb8b33-cbe4-46dd-83b0-d35325b63940-kube-api-access-9q6mb" (OuterVolumeSpecName: "kube-api-access-9q6mb") pod "19fb8b33-cbe4-46dd-83b0-d35325b63940" (UID: "19fb8b33-cbe4-46dd-83b0-d35325b63940"). InnerVolumeSpecName "kube-api-access-9q6mb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:23:54 crc kubenswrapper[4681]: I1007 17:23:54.654059 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7199ab3e-07fd-4c98-81e1-535f69a0f76d","Type":"ContainerStarted","Data":"d0d019e3322dc416a62b0b0eae3deba3b2dfbc58229e84a4a77b554f459396c1"} Oct 07 17:23:54 crc kubenswrapper[4681]: I1007 17:23:54.747568 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q6mb\" (UniqueName: \"kubernetes.io/projected/19fb8b33-cbe4-46dd-83b0-d35325b63940-kube-api-access-9q6mb\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:54 crc kubenswrapper[4681]: I1007 17:23:54.785712 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-txwrt"] Oct 07 17:23:55 crc kubenswrapper[4681]: I1007 17:23:55.675235 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-03d8-account-create-8f5sj" Oct 07 17:23:55 crc kubenswrapper[4681]: I1007 17:23:55.692822 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-03d8-account-create-8f5sj" event={"ID":"682a62d1-b6b0-4f0d-9f94-ba1f48d92447","Type":"ContainerDied","Data":"0262eab1ed0fcb63dca42deb43470c5a76ad8bcaa893f7aabe2ffb549d5287c0"} Oct 07 17:23:55 crc kubenswrapper[4681]: I1007 17:23:55.692859 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0262eab1ed0fcb63dca42deb43470c5a76ad8bcaa893f7aabe2ffb549d5287c0" Oct 07 17:23:55 crc kubenswrapper[4681]: I1007 17:23:55.692923 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-03d8-account-create-8f5sj" Oct 07 17:23:55 crc kubenswrapper[4681]: I1007 17:23:55.711243 4681 generic.go:334] "Generic (PLEG): container finished" podID="c0e22a43-39e2-4154-b998-dcc84cadf262" containerID="553dc9dbf1649d562b756878a4c23cf75a55aab78eade6900e43db6f13dd0a21" exitCode=143 Oct 07 17:23:55 crc kubenswrapper[4681]: I1007 17:23:55.711311 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-976bbb468-rxpr4" event={"ID":"c0e22a43-39e2-4154-b998-dcc84cadf262","Type":"ContainerDied","Data":"553dc9dbf1649d562b756878a4c23cf75a55aab78eade6900e43db6f13dd0a21"} Oct 07 17:23:55 crc kubenswrapper[4681]: I1007 17:23:55.713960 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-txwrt" event={"ID":"01926d51-8e89-44e0-8032-7a701b7fcb92","Type":"ContainerStarted","Data":"f339039a340a591bbed2cfee53a5ab906b4e2e8d7f52b91c225648e496f60e35"} Oct 07 17:23:55 crc kubenswrapper[4681]: I1007 17:23:55.775258 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5fln\" (UniqueName: \"kubernetes.io/projected/682a62d1-b6b0-4f0d-9f94-ba1f48d92447-kube-api-access-w5fln\") pod \"682a62d1-b6b0-4f0d-9f94-ba1f48d92447\" (UID: \"682a62d1-b6b0-4f0d-9f94-ba1f48d92447\") " Oct 07 17:23:55 crc kubenswrapper[4681]: I1007 17:23:55.786579 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/682a62d1-b6b0-4f0d-9f94-ba1f48d92447-kube-api-access-w5fln" (OuterVolumeSpecName: "kube-api-access-w5fln") pod "682a62d1-b6b0-4f0d-9f94-ba1f48d92447" (UID: "682a62d1-b6b0-4f0d-9f94-ba1f48d92447"). InnerVolumeSpecName "kube-api-access-w5fln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:23:55 crc kubenswrapper[4681]: I1007 17:23:55.877315 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5fln\" (UniqueName: \"kubernetes.io/projected/682a62d1-b6b0-4f0d-9f94-ba1f48d92447-kube-api-access-w5fln\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:56 crc kubenswrapper[4681]: I1007 17:23:56.382396 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 07 17:23:56 crc kubenswrapper[4681]: I1007 17:23:56.733452 4681 generic.go:334] "Generic (PLEG): container finished" podID="01926d51-8e89-44e0-8032-7a701b7fcb92" containerID="d38e7941d43d36d6a81b8bd886fbe48613c61b6aece459831545f7a6d423f482" exitCode=0 Oct 07 17:23:56 crc kubenswrapper[4681]: I1007 17:23:56.734254 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-txwrt" event={"ID":"01926d51-8e89-44e0-8032-7a701b7fcb92","Type":"ContainerDied","Data":"d38e7941d43d36d6a81b8bd886fbe48613c61b6aece459831545f7a6d423f482"} Oct 07 17:23:56 crc kubenswrapper[4681]: I1007 17:23:56.750919 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7199ab3e-07fd-4c98-81e1-535f69a0f76d","Type":"ContainerStarted","Data":"388fc72fde632c23d72981d68bc7c1f3915f6462bc0048c894f56576ec24d24e"} Oct 07 17:23:56 crc kubenswrapper[4681]: I1007 17:23:56.761347 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c","Type":"ContainerStarted","Data":"f1dd7f1aaf400668908a09d5ab40d8ff7a8fa0f00d27be483c484ab7f67cce3d"} Oct 07 17:23:57 crc kubenswrapper[4681]: I1007 17:23:57.441894 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-64677bd694-6xgb2" podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.142:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.142:8443: connect: connection refused" Oct 07 17:23:57 crc kubenswrapper[4681]: I1007 17:23:57.618321 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-f945f854d-hm49c" podUID="02a91326-9285-4589-a05b-c0a2c2ed397e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Oct 07 17:23:57 crc kubenswrapper[4681]: I1007 17:23:57.780657 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-txwrt" event={"ID":"01926d51-8e89-44e0-8032-7a701b7fcb92","Type":"ContainerStarted","Data":"eb3685b49e8b00675ad0765596fdbdbd7b0d3c9265cfa5ffccd03c0e3023be48"} Oct 07 17:23:57 crc kubenswrapper[4681]: I1007 17:23:57.780796 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-txwrt" Oct 07 17:23:57 crc kubenswrapper[4681]: I1007 17:23:57.785229 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7199ab3e-07fd-4c98-81e1-535f69a0f76d","Type":"ContainerStarted","Data":"705c9e025edf8e6fa16d7fe514262f1224439c95283275135d74f33374cc8e49"} Oct 07 17:23:57 crc kubenswrapper[4681]: I1007 17:23:57.785295 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7199ab3e-07fd-4c98-81e1-535f69a0f76d" containerName="cinder-api-log" 
containerID="cri-o://388fc72fde632c23d72981d68bc7c1f3915f6462bc0048c894f56576ec24d24e" gracePeriod=30 Oct 07 17:23:57 crc kubenswrapper[4681]: I1007 17:23:57.785335 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 07 17:23:57 crc kubenswrapper[4681]: I1007 17:23:57.785370 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7199ab3e-07fd-4c98-81e1-535f69a0f76d" containerName="cinder-api" containerID="cri-o://705c9e025edf8e6fa16d7fe514262f1224439c95283275135d74f33374cc8e49" gracePeriod=30 Oct 07 17:23:57 crc kubenswrapper[4681]: I1007 17:23:57.802653 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c","Type":"ContainerStarted","Data":"713a476ba3d7bcd7f60240d7acd21fef66d96a658846b9b494c9eb187900f229"} Oct 07 17:23:57 crc kubenswrapper[4681]: I1007 17:23:57.817813 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-txwrt" podStartSLOduration=5.817789135 podStartE2EDuration="5.817789135s" podCreationTimestamp="2025-10-07 17:23:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:23:57.806943213 +0000 UTC m=+1241.454354768" watchObservedRunningTime="2025-10-07 17:23:57.817789135 +0000 UTC m=+1241.465200690" Oct 07 17:23:57 crc kubenswrapper[4681]: I1007 17:23:57.862867 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.862846421 podStartE2EDuration="4.862846421s" podCreationTimestamp="2025-10-07 17:23:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:23:57.83910893 +0000 UTC m=+1241.486520485" watchObservedRunningTime="2025-10-07 17:23:57.862846421 +0000 UTC m=+1241.510257966" Oct 07 17:23:57 crc kubenswrapper[4681]: I1007 17:23:57.872823 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.42632621 podStartE2EDuration="5.872802918s" podCreationTimestamp="2025-10-07 17:23:52 +0000 UTC" firstStartedPulling="2025-10-07 17:23:54.022906203 +0000 UTC m=+1237.670317758" lastFinishedPulling="2025-10-07 17:23:55.469382911 +0000 UTC m=+1239.116794466" observedRunningTime="2025-10-07 17:23:57.86569909 +0000 UTC m=+1241.513110645" watchObservedRunningTime="2025-10-07 17:23:57.872802918 +0000 UTC m=+1241.520214473" Oct 07 17:23:58 crc kubenswrapper[4681]: I1007 17:23:58.156717 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 07 17:23:58 crc kubenswrapper[4681]: I1007 17:23:58.815464 4681 generic.go:334] "Generic (PLEG): container finished" podID="7199ab3e-07fd-4c98-81e1-535f69a0f76d" containerID="388fc72fde632c23d72981d68bc7c1f3915f6462bc0048c894f56576ec24d24e" exitCode=143 Oct 07 17:23:58 crc kubenswrapper[4681]: I1007 17:23:58.815953 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7199ab3e-07fd-4c98-81e1-535f69a0f76d","Type":"ContainerDied","Data":"388fc72fde632c23d72981d68bc7c1f3915f6462bc0048c894f56576ec24d24e"} Oct 07 17:23:58 crc kubenswrapper[4681]: I1007 17:23:58.926420 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-976bbb468-rxpr4" 
podUID="c0e22a43-39e2-4154-b998-dcc84cadf262" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:59634->10.217.0.163:9311: read: connection reset by peer" Oct 07 17:23:58 crc kubenswrapper[4681]: I1007 17:23:58.926473 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-976bbb468-rxpr4" podUID="c0e22a43-39e2-4154-b998-dcc84cadf262" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:59642->10.217.0.163:9311: read: connection reset by peer" Oct 07 17:23:59 crc kubenswrapper[4681]: E1007 17:23:59.268536 4681 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0e22a43_39e2_4154_b998_dcc84cadf262.slice/crio-conmon-38420a1fbd331bf5c3a9fe6aa06da5ece1572a8b32a67d00aff20238dd72afd3.scope\": RecentStats: unable to find data in memory cache]" Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.506579 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-976bbb468-rxpr4" Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.594061 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0e22a43-39e2-4154-b998-dcc84cadf262-config-data-custom\") pod \"c0e22a43-39e2-4154-b998-dcc84cadf262\" (UID: \"c0e22a43-39e2-4154-b998-dcc84cadf262\") " Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.594133 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e22a43-39e2-4154-b998-dcc84cadf262-config-data\") pod \"c0e22a43-39e2-4154-b998-dcc84cadf262\" (UID: \"c0e22a43-39e2-4154-b998-dcc84cadf262\") " Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.594287 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb9nz\" (UniqueName: \"kubernetes.io/projected/c0e22a43-39e2-4154-b998-dcc84cadf262-kube-api-access-xb9nz\") pod \"c0e22a43-39e2-4154-b998-dcc84cadf262\" (UID: \"c0e22a43-39e2-4154-b998-dcc84cadf262\") " Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.594315 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e22a43-39e2-4154-b998-dcc84cadf262-combined-ca-bundle\") pod \"c0e22a43-39e2-4154-b998-dcc84cadf262\" (UID: \"c0e22a43-39e2-4154-b998-dcc84cadf262\") " Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.594398 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0e22a43-39e2-4154-b998-dcc84cadf262-logs\") pod \"c0e22a43-39e2-4154-b998-dcc84cadf262\" (UID: \"c0e22a43-39e2-4154-b998-dcc84cadf262\") " Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.595172 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0e22a43-39e2-4154-b998-dcc84cadf262-logs" (OuterVolumeSpecName: "logs") pod "c0e22a43-39e2-4154-b998-dcc84cadf262" (UID: "c0e22a43-39e2-4154-b998-dcc84cadf262"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.604064 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0e22a43-39e2-4154-b998-dcc84cadf262-kube-api-access-xb9nz" (OuterVolumeSpecName: "kube-api-access-xb9nz") pod "c0e22a43-39e2-4154-b998-dcc84cadf262" (UID: "c0e22a43-39e2-4154-b998-dcc84cadf262"). InnerVolumeSpecName "kube-api-access-xb9nz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.604185 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0e22a43-39e2-4154-b998-dcc84cadf262-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c0e22a43-39e2-4154-b998-dcc84cadf262" (UID: "c0e22a43-39e2-4154-b998-dcc84cadf262"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.642938 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0e22a43-39e2-4154-b998-dcc84cadf262-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0e22a43-39e2-4154-b998-dcc84cadf262" (UID: "c0e22a43-39e2-4154-b998-dcc84cadf262"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.671336 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0e22a43-39e2-4154-b998-dcc84cadf262-config-data" (OuterVolumeSpecName: "config-data") pod "c0e22a43-39e2-4154-b998-dcc84cadf262" (UID: "c0e22a43-39e2-4154-b998-dcc84cadf262"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.696488 4681 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0e22a43-39e2-4154-b998-dcc84cadf262-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.696519 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e22a43-39e2-4154-b998-dcc84cadf262-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.696528 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb9nz\" (UniqueName: \"kubernetes.io/projected/c0e22a43-39e2-4154-b998-dcc84cadf262-kube-api-access-xb9nz\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.696538 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e22a43-39e2-4154-b998-dcc84cadf262-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.696548 4681 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0e22a43-39e2-4154-b998-dcc84cadf262-logs\") on node \"crc\" DevicePath \"\"" Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.824337 4681 generic.go:334] "Generic (PLEG): container finished" podID="c0e22a43-39e2-4154-b998-dcc84cadf262" containerID="38420a1fbd331bf5c3a9fe6aa06da5ece1572a8b32a67d00aff20238dd72afd3" exitCode=0 Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.825130 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-976bbb468-rxpr4" Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.825950 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-976bbb468-rxpr4" event={"ID":"c0e22a43-39e2-4154-b998-dcc84cadf262","Type":"ContainerDied","Data":"38420a1fbd331bf5c3a9fe6aa06da5ece1572a8b32a67d00aff20238dd72afd3"} Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.826418 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-976bbb468-rxpr4" event={"ID":"c0e22a43-39e2-4154-b998-dcc84cadf262","Type":"ContainerDied","Data":"14c1d6af19122277cbf0f32e17e6963bae818441771d97fcb9f3c9e41b9749b3"} Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.826438 4681 scope.go:117] "RemoveContainer" containerID="38420a1fbd331bf5c3a9fe6aa06da5ece1572a8b32a67d00aff20238dd72afd3" Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.862978 4681 scope.go:117] "RemoveContainer" containerID="553dc9dbf1649d562b756878a4c23cf75a55aab78eade6900e43db6f13dd0a21" Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.875707 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-976bbb468-rxpr4"] Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.888137 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-976bbb468-rxpr4"] Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.892288 4681 scope.go:117] "RemoveContainer" containerID="38420a1fbd331bf5c3a9fe6aa06da5ece1572a8b32a67d00aff20238dd72afd3" Oct 07 17:23:59 crc kubenswrapper[4681]: E1007 17:23:59.892808 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38420a1fbd331bf5c3a9fe6aa06da5ece1572a8b32a67d00aff20238dd72afd3\": container with ID starting with 38420a1fbd331bf5c3a9fe6aa06da5ece1572a8b32a67d00aff20238dd72afd3 not found: ID does not exist" containerID="38420a1fbd331bf5c3a9fe6aa06da5ece1572a8b32a67d00aff20238dd72afd3" Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.892839 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38420a1fbd331bf5c3a9fe6aa06da5ece1572a8b32a67d00aff20238dd72afd3"} err="failed to get container status \"38420a1fbd331bf5c3a9fe6aa06da5ece1572a8b32a67d00aff20238dd72afd3\": rpc error: code = NotFound desc = could not find container \"38420a1fbd331bf5c3a9fe6aa06da5ece1572a8b32a67d00aff20238dd72afd3\": container with ID starting with 38420a1fbd331bf5c3a9fe6aa06da5ece1572a8b32a67d00aff20238dd72afd3 not found: ID does not exist" Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.892861 4681 scope.go:117] "RemoveContainer" containerID="553dc9dbf1649d562b756878a4c23cf75a55aab78eade6900e43db6f13dd0a21" Oct 07 17:23:59 crc kubenswrapper[4681]: E1007 17:23:59.893256 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"553dc9dbf1649d562b756878a4c23cf75a55aab78eade6900e43db6f13dd0a21\": container with ID starting with 553dc9dbf1649d562b756878a4c23cf75a55aab78eade6900e43db6f13dd0a21 not found: ID does not exist" containerID="553dc9dbf1649d562b756878a4c23cf75a55aab78eade6900e43db6f13dd0a21" Oct 07 17:23:59 crc kubenswrapper[4681]: I1007 17:23:59.893293 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"553dc9dbf1649d562b756878a4c23cf75a55aab78eade6900e43db6f13dd0a21"} err="failed to get container status 
\"553dc9dbf1649d562b756878a4c23cf75a55aab78eade6900e43db6f13dd0a21\": rpc error: code = NotFound desc = could not find container \"553dc9dbf1649d562b756878a4c23cf75a55aab78eade6900e43db6f13dd0a21\": container with ID starting with 553dc9dbf1649d562b756878a4c23cf75a55aab78eade6900e43db6f13dd0a21 not found: ID does not exist" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.628643 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-djn84"] Oct 07 17:24:00 crc kubenswrapper[4681]: E1007 17:24:00.629308 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e22a43-39e2-4154-b998-dcc84cadf262" containerName="barbican-api-log" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.629329 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e22a43-39e2-4154-b998-dcc84cadf262" containerName="barbican-api-log" Oct 07 17:24:00 crc kubenswrapper[4681]: E1007 17:24:00.629343 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="682a62d1-b6b0-4f0d-9f94-ba1f48d92447" containerName="mariadb-account-create" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.629350 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="682a62d1-b6b0-4f0d-9f94-ba1f48d92447" containerName="mariadb-account-create" Oct 07 17:24:00 crc kubenswrapper[4681]: E1007 17:24:00.629357 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f39303f5-20d2-4d09-8033-70a8c3ad916b" containerName="mariadb-account-create" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.629363 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="f39303f5-20d2-4d09-8033-70a8c3ad916b" containerName="mariadb-account-create" Oct 07 17:24:00 crc kubenswrapper[4681]: E1007 17:24:00.629372 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19fb8b33-cbe4-46dd-83b0-d35325b63940" containerName="mariadb-account-create" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.629378 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="19fb8b33-cbe4-46dd-83b0-d35325b63940" containerName="mariadb-account-create" Oct 07 17:24:00 crc kubenswrapper[4681]: E1007 17:24:00.629412 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e22a43-39e2-4154-b998-dcc84cadf262" containerName="barbican-api" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.629418 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e22a43-39e2-4154-b998-dcc84cadf262" containerName="barbican-api" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.629583 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0e22a43-39e2-4154-b998-dcc84cadf262" containerName="barbican-api" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.629602 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="682a62d1-b6b0-4f0d-9f94-ba1f48d92447" containerName="mariadb-account-create" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.629612 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="f39303f5-20d2-4d09-8033-70a8c3ad916b" containerName="mariadb-account-create" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.629625 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0e22a43-39e2-4154-b998-dcc84cadf262" containerName="barbican-api-log" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.629636 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="19fb8b33-cbe4-46dd-83b0-d35325b63940" containerName="mariadb-account-create" Oct 
07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.630248 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-djn84" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.634387 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rj5mp" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.634387 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.649835 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.670022 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-djn84"] Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.715112 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be2ce7e8-5280-4cfa-b2b1-d680465cd889-scripts\") pod \"nova-cell0-conductor-db-sync-djn84\" (UID: \"be2ce7e8-5280-4cfa-b2b1-d680465cd889\") " pod="openstack/nova-cell0-conductor-db-sync-djn84" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.715175 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be2ce7e8-5280-4cfa-b2b1-d680465cd889-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-djn84\" (UID: \"be2ce7e8-5280-4cfa-b2b1-d680465cd889\") " pod="openstack/nova-cell0-conductor-db-sync-djn84" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.715205 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw5bv\" (UniqueName: \"kubernetes.io/projected/be2ce7e8-5280-4cfa-b2b1-d680465cd889-kube-api-access-nw5bv\") pod \"nova-cell0-conductor-db-sync-djn84\" (UID: \"be2ce7e8-5280-4cfa-b2b1-d680465cd889\") " pod="openstack/nova-cell0-conductor-db-sync-djn84" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.715243 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be2ce7e8-5280-4cfa-b2b1-d680465cd889-config-data\") pod \"nova-cell0-conductor-db-sync-djn84\" (UID: \"be2ce7e8-5280-4cfa-b2b1-d680465cd889\") " pod="openstack/nova-cell0-conductor-db-sync-djn84" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.817243 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be2ce7e8-5280-4cfa-b2b1-d680465cd889-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-djn84\" (UID: \"be2ce7e8-5280-4cfa-b2b1-d680465cd889\") " pod="openstack/nova-cell0-conductor-db-sync-djn84" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.817306 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw5bv\" (UniqueName: \"kubernetes.io/projected/be2ce7e8-5280-4cfa-b2b1-d680465cd889-kube-api-access-nw5bv\") pod \"nova-cell0-conductor-db-sync-djn84\" (UID: \"be2ce7e8-5280-4cfa-b2b1-d680465cd889\") " pod="openstack/nova-cell0-conductor-db-sync-djn84" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.817360 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/be2ce7e8-5280-4cfa-b2b1-d680465cd889-config-data\") pod \"nova-cell0-conductor-db-sync-djn84\" (UID: \"be2ce7e8-5280-4cfa-b2b1-d680465cd889\") " pod="openstack/nova-cell0-conductor-db-sync-djn84" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.817527 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be2ce7e8-5280-4cfa-b2b1-d680465cd889-scripts\") pod \"nova-cell0-conductor-db-sync-djn84\" (UID: \"be2ce7e8-5280-4cfa-b2b1-d680465cd889\") " pod="openstack/nova-cell0-conductor-db-sync-djn84" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.823392 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be2ce7e8-5280-4cfa-b2b1-d680465cd889-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-djn84\" (UID: \"be2ce7e8-5280-4cfa-b2b1-d680465cd889\") " pod="openstack/nova-cell0-conductor-db-sync-djn84" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.840587 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be2ce7e8-5280-4cfa-b2b1-d680465cd889-config-data\") pod \"nova-cell0-conductor-db-sync-djn84\" (UID: \"be2ce7e8-5280-4cfa-b2b1-d680465cd889\") " pod="openstack/nova-cell0-conductor-db-sync-djn84" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.840637 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be2ce7e8-5280-4cfa-b2b1-d680465cd889-scripts\") pod \"nova-cell0-conductor-db-sync-djn84\" (UID: \"be2ce7e8-5280-4cfa-b2b1-d680465cd889\") " pod="openstack/nova-cell0-conductor-db-sync-djn84" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.847151 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw5bv\" (UniqueName: \"kubernetes.io/projected/be2ce7e8-5280-4cfa-b2b1-d680465cd889-kube-api-access-nw5bv\") pod \"nova-cell0-conductor-db-sync-djn84\" (UID: \"be2ce7e8-5280-4cfa-b2b1-d680465cd889\") " pod="openstack/nova-cell0-conductor-db-sync-djn84" Oct 07 17:24:00 crc kubenswrapper[4681]: I1007 17:24:00.946530 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-djn84" Oct 07 17:24:01 crc kubenswrapper[4681]: I1007 17:24:01.052182 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0e22a43-39e2-4154-b998-dcc84cadf262" path="/var/lib/kubelet/pods/c0e22a43-39e2-4154-b998-dcc84cadf262/volumes" Oct 07 17:24:01 crc kubenswrapper[4681]: I1007 17:24:01.513636 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-djn84"] Oct 07 17:24:01 crc kubenswrapper[4681]: W1007 17:24:01.528372 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe2ce7e8_5280_4cfa_b2b1_d680465cd889.slice/crio-fa03ba9b6556f90d5e204175c9dea6ce67a3ba0c0cfb71a7481d2ecc4fdb820d WatchSource:0}: Error finding container fa03ba9b6556f90d5e204175c9dea6ce67a3ba0c0cfb71a7481d2ecc4fdb820d: Status 404 returned error can't find the container with id fa03ba9b6556f90d5e204175c9dea6ce67a3ba0c0cfb71a7481d2ecc4fdb820d Oct 07 17:24:01 crc kubenswrapper[4681]: I1007 17:24:01.882272 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-djn84" event={"ID":"be2ce7e8-5280-4cfa-b2b1-d680465cd889","Type":"ContainerStarted","Data":"fa03ba9b6556f90d5e204175c9dea6ce67a3ba0c0cfb71a7481d2ecc4fdb820d"} Oct 07 17:24:03 crc kubenswrapper[4681]: I1007 17:24:03.263069 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-txwrt" Oct 07 17:24:03 crc kubenswrapper[4681]: I1007 17:24:03.332295 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-sczsk"] Oct 07 17:24:03 crc kubenswrapper[4681]: I1007 17:24:03.332529 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" podUID="025296af-e542-46ae-a44e-9288982278e5" containerName="dnsmasq-dns" containerID="cri-o://b27cfe1a4a0bb7f071077c53c8144bc8a034567d2bdf6b1bf2ac69acd1a9b777" gracePeriod=10 Oct 07 17:24:03 crc kubenswrapper[4681]: I1007 17:24:03.510478 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 07 17:24:03 crc kubenswrapper[4681]: I1007 17:24:03.575066 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 17:24:03 crc kubenswrapper[4681]: I1007 17:24:03.930773 4681 generic.go:334] "Generic (PLEG): container finished" podID="025296af-e542-46ae-a44e-9288982278e5" containerID="b27cfe1a4a0bb7f071077c53c8144bc8a034567d2bdf6b1bf2ac69acd1a9b777" exitCode=0 Oct 07 17:24:03 crc kubenswrapper[4681]: I1007 17:24:03.943223 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" event={"ID":"025296af-e542-46ae-a44e-9288982278e5","Type":"ContainerDied","Data":"b27cfe1a4a0bb7f071077c53c8144bc8a034567d2bdf6b1bf2ac69acd1a9b777"} Oct 07 17:24:03 crc kubenswrapper[4681]: I1007 17:24:03.943292 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" event={"ID":"025296af-e542-46ae-a44e-9288982278e5","Type":"ContainerDied","Data":"4b87779c8ebd9eb4548272913c1c728190b5d209f3b9762e47baf87f17227766"} Oct 07 17:24:03 crc kubenswrapper[4681]: I1007 17:24:03.943313 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b87779c8ebd9eb4548272913c1c728190b5d209f3b9762e47baf87f17227766" Oct 07 17:24:03 crc kubenswrapper[4681]: I1007 17:24:03.945683 4681 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c" containerName="cinder-scheduler" containerID="cri-o://f1dd7f1aaf400668908a09d5ab40d8ff7a8fa0f00d27be483c484ab7f67cce3d" gracePeriod=30 Oct 07 17:24:03 crc kubenswrapper[4681]: I1007 17:24:03.946137 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c" containerName="probe" containerID="cri-o://713a476ba3d7bcd7f60240d7acd21fef66d96a658846b9b494c9eb187900f229" gracePeriod=30 Oct 07 17:24:04 crc kubenswrapper[4681]: I1007 17:24:04.002733 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" Oct 07 17:24:04 crc kubenswrapper[4681]: I1007 17:24:04.113422 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-dns-svc\") pod \"025296af-e542-46ae-a44e-9288982278e5\" (UID: \"025296af-e542-46ae-a44e-9288982278e5\") " Oct 07 17:24:04 crc kubenswrapper[4681]: I1007 17:24:04.113514 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-ovsdbserver-nb\") pod \"025296af-e542-46ae-a44e-9288982278e5\" (UID: \"025296af-e542-46ae-a44e-9288982278e5\") " Oct 07 17:24:04 crc kubenswrapper[4681]: I1007 17:24:04.113580 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffn7x\" (UniqueName: \"kubernetes.io/projected/025296af-e542-46ae-a44e-9288982278e5-kube-api-access-ffn7x\") pod \"025296af-e542-46ae-a44e-9288982278e5\" (UID: \"025296af-e542-46ae-a44e-9288982278e5\") " Oct 07 17:24:04 crc kubenswrapper[4681]: I1007 17:24:04.113605 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-config\") pod \"025296af-e542-46ae-a44e-9288982278e5\" (UID: \"025296af-e542-46ae-a44e-9288982278e5\") " Oct 07 17:24:04 crc kubenswrapper[4681]: I1007 17:24:04.113665 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-dns-swift-storage-0\") pod \"025296af-e542-46ae-a44e-9288982278e5\" (UID: \"025296af-e542-46ae-a44e-9288982278e5\") " Oct 07 17:24:04 crc kubenswrapper[4681]: I1007 17:24:04.113713 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-ovsdbserver-sb\") pod \"025296af-e542-46ae-a44e-9288982278e5\" (UID: \"025296af-e542-46ae-a44e-9288982278e5\") " Oct 07 17:24:04 crc kubenswrapper[4681]: I1007 17:24:04.153133 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/025296af-e542-46ae-a44e-9288982278e5-kube-api-access-ffn7x" (OuterVolumeSpecName: "kube-api-access-ffn7x") pod "025296af-e542-46ae-a44e-9288982278e5" (UID: "025296af-e542-46ae-a44e-9288982278e5"). InnerVolumeSpecName "kube-api-access-ffn7x". PluginName "kubernetes.io/projected", VolumeGidValue ""
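
The two "Killing container with a grace period" entries above show the kubelet starting graceful termination of both cinder-scheduler-0 containers with the pod's 30-second grace period (the dnsmasq-dns container just before them got gracePeriod=10); a SIGKILL follows only if the container outlives that window. As an illustrative aside, not part of the log: a minimal Go sketch, assuming only the entry shape visible here, that pulls the pod, container name, and grace period out of such lines (the file name killings.go and the journalctl pipeline are assumptions):

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Extract pod, container name, and grace period from kubelet
// "Killing container with a grace period" journal entries, e.g.
//   journalctl -u kubelet --no-pager | go run killings.go
var killRe = regexp.MustCompile(`"Killing container with a grace period" pod="([^"]+)" podUID="([^"]+)" containerName="([^"]+)" containerID="([^"]+)" gracePeriod=(\d+)`)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024) // journal lines can be very long
	for sc.Scan() {
		if m := killRe.FindStringSubmatch(sc.Text()); m != nil {
			fmt.Printf("%-45s container=%-16s grace=%ss\n", m[1], m[3], m[5])
		}
	}
}

Fed the entries above, this would report the cinder-scheduler and probe containers with grace=30 and the dnsmasq-dns container with grace=10.
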
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:24:04 crc kubenswrapper[4681]: I1007 17:24:04.219108 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffn7x\" (UniqueName: \"kubernetes.io/projected/025296af-e542-46ae-a44e-9288982278e5-kube-api-access-ffn7x\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:04 crc kubenswrapper[4681]: I1007 17:24:04.262826 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "025296af-e542-46ae-a44e-9288982278e5" (UID: "025296af-e542-46ae-a44e-9288982278e5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:24:04 crc kubenswrapper[4681]: I1007 17:24:04.273334 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "025296af-e542-46ae-a44e-9288982278e5" (UID: "025296af-e542-46ae-a44e-9288982278e5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:24:04 crc kubenswrapper[4681]: I1007 17:24:04.283264 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-config" (OuterVolumeSpecName: "config") pod "025296af-e542-46ae-a44e-9288982278e5" (UID: "025296af-e542-46ae-a44e-9288982278e5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:24:04 crc kubenswrapper[4681]: I1007 17:24:04.286312 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "025296af-e542-46ae-a44e-9288982278e5" (UID: "025296af-e542-46ae-a44e-9288982278e5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:24:04 crc kubenswrapper[4681]: I1007 17:24:04.286602 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "025296af-e542-46ae-a44e-9288982278e5" (UID: "025296af-e542-46ae-a44e-9288982278e5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:24:04 crc kubenswrapper[4681]: I1007 17:24:04.321350 4681 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:04 crc kubenswrapper[4681]: I1007 17:24:04.321385 4681 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:04 crc kubenswrapper[4681]: I1007 17:24:04.321395 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:04 crc kubenswrapper[4681]: I1007 17:24:04.321404 4681 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:04 crc kubenswrapper[4681]: I1007 17:24:04.321414 4681 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/025296af-e542-46ae-a44e-9288982278e5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:04 crc kubenswrapper[4681]: I1007 17:24:04.553367 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-67c6d485d8-ww4wp" Oct 07 17:24:04 crc kubenswrapper[4681]: I1007 17:24:04.949579 4681 generic.go:334] "Generic (PLEG): container finished" podID="93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c" containerID="713a476ba3d7bcd7f60240d7acd21fef66d96a658846b9b494c9eb187900f229" exitCode=0 Oct 07 17:24:04 crc kubenswrapper[4681]: I1007 17:24:04.949664 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-sczsk" Oct 07 17:24:04 crc kubenswrapper[4681]: I1007 17:24:04.950097 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c","Type":"ContainerDied","Data":"713a476ba3d7bcd7f60240d7acd21fef66d96a658846b9b494c9eb187900f229"} Oct 07 17:24:04 crc kubenswrapper[4681]: I1007 17:24:04.982978 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-sczsk"] Oct 07 17:24:05 crc kubenswrapper[4681]: I1007 17:24:05.002531 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-sczsk"] Oct 07 17:24:05 crc kubenswrapper[4681]: I1007 17:24:05.044444 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="025296af-e542-46ae-a44e-9288982278e5" path="/var/lib/kubelet/pods/025296af-e542-46ae-a44e-9288982278e5/volumes" Oct 07 17:24:05 crc kubenswrapper[4681]: I1007 17:24:05.961792 4681 generic.go:334] "Generic (PLEG): container finished" podID="93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c" containerID="f1dd7f1aaf400668908a09d5ab40d8ff7a8fa0f00d27be483c484ab7f67cce3d" exitCode=0 Oct 07 17:24:05 crc kubenswrapper[4681]: I1007 17:24:05.961869 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c","Type":"ContainerDied","Data":"f1dd7f1aaf400668908a09d5ab40d8ff7a8fa0f00d27be483c484ab7f67cce3d"} Oct 07 17:24:06 crc kubenswrapper[4681]: I1007 17:24:06.484794 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 17:24:06 crc kubenswrapper[4681]: I1007 17:24:06.564420 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-combined-ca-bundle\") pod \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\" (UID: \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\") " Oct 07 17:24:06 crc kubenswrapper[4681]: I1007 17:24:06.564535 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbrgp\" (UniqueName: \"kubernetes.io/projected/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-kube-api-access-lbrgp\") pod \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\" (UID: \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\") " Oct 07 17:24:06 crc kubenswrapper[4681]: I1007 17:24:06.564580 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-config-data\") pod \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\" (UID: \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\") " Oct 07 17:24:06 crc kubenswrapper[4681]: I1007 17:24:06.564616 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-config-data-custom\") pod \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\" (UID: \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\") " Oct 07 17:24:06 crc kubenswrapper[4681]: I1007 17:24:06.564653 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-scripts\") pod \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\" (UID: \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\") " Oct 07 17:24:06 crc kubenswrapper[4681]: I1007 17:24:06.564770 4681 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-etc-machine-id\") pod \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\" (UID: \"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c\") " Oct 07 17:24:06 crc kubenswrapper[4681]: I1007 17:24:06.565172 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c" (UID: "93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 17:24:06 crc kubenswrapper[4681]: I1007 17:24:06.573612 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c" (UID: "93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:06 crc kubenswrapper[4681]: I1007 17:24:06.584212 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-kube-api-access-lbrgp" (OuterVolumeSpecName: "kube-api-access-lbrgp") pod "93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c" (UID: "93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c"). InnerVolumeSpecName "kube-api-access-lbrgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:24:06 crc kubenswrapper[4681]: I1007 17:24:06.584491 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-scripts" (OuterVolumeSpecName: "scripts") pod "93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c" (UID: "93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:06 crc kubenswrapper[4681]: I1007 17:24:06.673108 4681 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:06 crc kubenswrapper[4681]: I1007 17:24:06.673136 4681 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:06 crc kubenswrapper[4681]: I1007 17:24:06.673145 4681 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:06 crc kubenswrapper[4681]: I1007 17:24:06.673153 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbrgp\" (UniqueName: \"kubernetes.io/projected/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-kube-api-access-lbrgp\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:06 crc kubenswrapper[4681]: I1007 17:24:06.698132 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c" (UID: "93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:06 crc kubenswrapper[4681]: I1007 17:24:06.727696 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-config-data" (OuterVolumeSpecName: "config-data") pod "93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c" (UID: "93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:06 crc kubenswrapper[4681]: I1007 17:24:06.774832 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:06 crc kubenswrapper[4681]: I1007 17:24:06.774859 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:06 crc kubenswrapper[4681]: I1007 17:24:06.975290 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c","Type":"ContainerDied","Data":"2403faf2f3f955e0b4e445e07296a6e4c55e90a390611151e2fd447d40f540a0"} Oct 07 17:24:06 crc kubenswrapper[4681]: I1007 17:24:06.975343 4681 scope.go:117] "RemoveContainer" containerID="713a476ba3d7bcd7f60240d7acd21fef66d96a658846b9b494c9eb187900f229" Oct 07 17:24:06 crc kubenswrapper[4681]: I1007 17:24:06.975347 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.017430 4681 scope.go:117] "RemoveContainer" containerID="f1dd7f1aaf400668908a09d5ab40d8ff7a8fa0f00d27be483c484ab7f67cce3d" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.025955 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.052163 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.052367 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 17:24:07 crc kubenswrapper[4681]: E1007 17:24:07.052680 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c" containerName="cinder-scheduler" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.055054 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c" containerName="cinder-scheduler" Oct 07 17:24:07 crc kubenswrapper[4681]: E1007 17:24:07.055133 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="025296af-e542-46ae-a44e-9288982278e5" containerName="init" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.055191 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="025296af-e542-46ae-a44e-9288982278e5" containerName="init" Oct 07 17:24:07 crc kubenswrapper[4681]: E1007 17:24:07.055315 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c" containerName="probe" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.055377 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c" containerName="probe" Oct 07 17:24:07 crc kubenswrapper[4681]: E1007 17:24:07.055438 4681 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="025296af-e542-46ae-a44e-9288982278e5" containerName="dnsmasq-dns" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.055503 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="025296af-e542-46ae-a44e-9288982278e5" containerName="dnsmasq-dns" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.055786 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="025296af-e542-46ae-a44e-9288982278e5" containerName="dnsmasq-dns" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.055850 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c" containerName="cinder-scheduler" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.055977 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c" containerName="probe" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.058467 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.063541 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.115273 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.185823 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9\") " pod="openstack/cinder-scheduler-0" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.185893 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9-config-data\") pod \"cinder-scheduler-0\" (UID: \"541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9\") " pod="openstack/cinder-scheduler-0" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.185913 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8mp4\" (UniqueName: \"kubernetes.io/projected/541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9-kube-api-access-k8mp4\") pod \"cinder-scheduler-0\" (UID: \"541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9\") " pod="openstack/cinder-scheduler-0" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.185960 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9-scripts\") pod \"cinder-scheduler-0\" (UID: \"541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9\") " pod="openstack/cinder-scheduler-0" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.186033 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9\") " pod="openstack/cinder-scheduler-0" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.186061 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9\") " pod="openstack/cinder-scheduler-0" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.287651 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9-scripts\") pod \"cinder-scheduler-0\" (UID: \"541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9\") " pod="openstack/cinder-scheduler-0" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.287764 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9\") " pod="openstack/cinder-scheduler-0" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.287816 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9\") " pod="openstack/cinder-scheduler-0" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.287904 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9\") " pod="openstack/cinder-scheduler-0" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.287932 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9-config-data\") pod \"cinder-scheduler-0\" (UID: \"541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9\") " pod="openstack/cinder-scheduler-0" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.287935 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9\") " pod="openstack/cinder-scheduler-0" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.287951 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8mp4\" (UniqueName: \"kubernetes.io/projected/541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9-kube-api-access-k8mp4\") pod \"cinder-scheduler-0\" (UID: \"541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9\") " pod="openstack/cinder-scheduler-0" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.307950 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9\") " pod="openstack/cinder-scheduler-0" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.307973 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9-scripts\") pod \"cinder-scheduler-0\" (UID: \"541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9\") " pod="openstack/cinder-scheduler-0" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.308258 4681 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9-config-data\") pod \"cinder-scheduler-0\" (UID: \"541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9\") " pod="openstack/cinder-scheduler-0" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.310645 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9\") " pod="openstack/cinder-scheduler-0" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.313460 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8mp4\" (UniqueName: \"kubernetes.io/projected/541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9-kube-api-access-k8mp4\") pod \"cinder-scheduler-0\" (UID: \"541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9\") " pod="openstack/cinder-scheduler-0" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.384289 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.441043 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-64677bd694-6xgb2" podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.142:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.142:8443: connect: connection refused" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.441126 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.441860 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"af63601f836949946b81ec10e42eb0edfd94800d61baa6f37919799bbd67f8db"} pod="openstack/horizon-64677bd694-6xgb2" containerMessage="Container horizon failed startup probe, will be restarted" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.441907 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-64677bd694-6xgb2" podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon" containerID="cri-o://af63601f836949946b81ec10e42eb0edfd94800d61baa6f37919799bbd67f8db" gracePeriod=30 Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.626719 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-f945f854d-hm49c" podUID="02a91326-9285-4589-a05b-c0a2c2ed397e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.627041 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.627753 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"a2ef2c60f997a9c728de5a3cb38dc728740b1786f9bd5808e689dbe3f49f3013"} pod="openstack/horizon-f945f854d-hm49c" containerMessage="Container horizon failed startup probe, will be restarted" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.627787 4681 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/horizon-f945f854d-hm49c" podUID="02a91326-9285-4589-a05b-c0a2c2ed397e" containerName="horizon" containerID="cri-o://a2ef2c60f997a9c728de5a3cb38dc728740b1786f9bd5808e689dbe3f49f3013" gracePeriod=30 Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.854247 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 07 17:24:07 crc kubenswrapper[4681]: I1007 17:24:07.972139 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 17:24:08 crc kubenswrapper[4681]: I1007 17:24:08.173323 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5b94d78545-dfdgb" Oct 07 17:24:08 crc kubenswrapper[4681]: I1007 17:24:08.228183 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-67c6d485d8-ww4wp"] Oct 07 17:24:08 crc kubenswrapper[4681]: I1007 17:24:08.228403 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-67c6d485d8-ww4wp" podUID="f40de6a5-783a-4e65-8cfc-dd84f8652b6f" containerName="neutron-api" containerID="cri-o://dbbb9faedc762111f44aab2a08b7341d199126351c97bd8b447a1397e913d93a" gracePeriod=30 Oct 07 17:24:08 crc kubenswrapper[4681]: I1007 17:24:08.228757 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-67c6d485d8-ww4wp" podUID="f40de6a5-783a-4e65-8cfc-dd84f8652b6f" containerName="neutron-httpd" containerID="cri-o://dd8f65c730c61845fe20219f2d0e5fe26b5bbf646d4b0ae5bc18d711ed0d103c" gracePeriod=30 Oct 07 17:24:09 crc kubenswrapper[4681]: I1007 17:24:09.039651 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c" path="/var/lib/kubelet/pods/93cbdc0c-9d0f-4c8f-a43b-48e86e8ba49c/volumes" Oct 07 17:24:09 crc kubenswrapper[4681]: I1007 17:24:09.049488 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9","Type":"ContainerStarted","Data":"6c86b796f8ea8b091eafdc05a5cc3efed512b93760cfdf1b24d8c75dcfc95697"} Oct 07 17:24:09 crc kubenswrapper[4681]: I1007 17:24:09.051142 4681 generic.go:334] "Generic (PLEG): container finished" podID="f40de6a5-783a-4e65-8cfc-dd84f8652b6f" containerID="dd8f65c730c61845fe20219f2d0e5fe26b5bbf646d4b0ae5bc18d711ed0d103c" exitCode=0 Oct 07 17:24:09 crc kubenswrapper[4681]: I1007 17:24:09.051166 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67c6d485d8-ww4wp" event={"ID":"f40de6a5-783a-4e65-8cfc-dd84f8652b6f","Type":"ContainerDied","Data":"dd8f65c730c61845fe20219f2d0e5fe26b5bbf646d4b0ae5bc18d711ed0d103c"} Oct 07 17:24:11 crc kubenswrapper[4681]: I1007 17:24:11.799799 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 07 17:24:12 crc kubenswrapper[4681]: I1007 17:24:12.195169 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 17:24:12 crc kubenswrapper[4681]: I1007 17:24:12.195498 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 17:24:14 crc kubenswrapper[4681]: I1007 17:24:14.108629 4681 generic.go:334] "Generic (PLEG): container finished" podID="f40de6a5-783a-4e65-8cfc-dd84f8652b6f" containerID="dbbb9faedc762111f44aab2a08b7341d199126351c97bd8b447a1397e913d93a" exitCode=0 Oct 07 17:24:14 crc kubenswrapper[4681]: I1007 17:24:14.108983 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67c6d485d8-ww4wp" event={"ID":"f40de6a5-783a-4e65-8cfc-dd84f8652b6f","Type":"ContainerDied","Data":"dbbb9faedc762111f44aab2a08b7341d199126351c97bd8b447a1397e913d93a"} Oct 07 17:24:16 crc kubenswrapper[4681]: I1007 17:24:16.278111 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 17:24:16 crc kubenswrapper[4681]: I1007 17:24:16.278314 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="88c0d090-0803-4fff-a9a3-9b41529b8a23" containerName="kube-state-metrics" containerID="cri-o://a150eb7edc06fc7b830f6ee5465ba9379608e4d886850e0a437b8f3dd5a28a91" gracePeriod=30 Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.171765 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.181792 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-djn84" event={"ID":"be2ce7e8-5280-4cfa-b2b1-d680465cd889","Type":"ContainerStarted","Data":"db30f3a7f2355011a879a9893bac5559f8ade79cfc795e4262ee88633bf5eb1c"} Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.205583 4681 generic.go:334] "Generic (PLEG): container finished" podID="88c0d090-0803-4fff-a9a3-9b41529b8a23" containerID="a150eb7edc06fc7b830f6ee5465ba9379608e4d886850e0a437b8f3dd5a28a91" exitCode=2 Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.205727 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.205747 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"88c0d090-0803-4fff-a9a3-9b41529b8a23","Type":"ContainerDied","Data":"a150eb7edc06fc7b830f6ee5465ba9379608e4d886850e0a437b8f3dd5a28a91"} Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.210074 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"88c0d090-0803-4fff-a9a3-9b41529b8a23","Type":"ContainerDied","Data":"14706f92fb10e846b1e3989b0f6e54ec10fd237202a3b8630730084e83732610"} Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.210136 4681 scope.go:117] "RemoveContainer" containerID="a150eb7edc06fc7b830f6ee5465ba9379608e4d886850e0a437b8f3dd5a28a91" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.263490 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-djn84" podStartSLOduration=2.179476358 podStartE2EDuration="17.263469598s" podCreationTimestamp="2025-10-07 17:24:00 +0000 UTC" firstStartedPulling="2025-10-07 17:24:01.530372155 +0000 UTC m=+1245.177783710" lastFinishedPulling="2025-10-07 17:24:16.614365395 +0000 UTC m=+1260.261776950" observedRunningTime="2025-10-07 17:24:17.209871085 +0000 UTC m=+1260.857282640" watchObservedRunningTime="2025-10-07 17:24:17.263469598 +0000 UTC m=+1260.910881153" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.278264 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67c6d485d8-ww4wp" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.293456 4681 scope.go:117] "RemoveContainer" containerID="a150eb7edc06fc7b830f6ee5465ba9379608e4d886850e0a437b8f3dd5a28a91" Oct 07 17:24:17 crc kubenswrapper[4681]: E1007 17:24:17.294452 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a150eb7edc06fc7b830f6ee5465ba9379608e4d886850e0a437b8f3dd5a28a91\": container with ID starting with a150eb7edc06fc7b830f6ee5465ba9379608e4d886850e0a437b8f3dd5a28a91 not found: ID does not exist" containerID="a150eb7edc06fc7b830f6ee5465ba9379608e4d886850e0a437b8f3dd5a28a91" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.294478 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a150eb7edc06fc7b830f6ee5465ba9379608e4d886850e0a437b8f3dd5a28a91"} err="failed to get container status \"a150eb7edc06fc7b830f6ee5465ba9379608e4d886850e0a437b8f3dd5a28a91\": rpc error: code = NotFound desc = could not find container \"a150eb7edc06fc7b830f6ee5465ba9379608e4d886850e0a437b8f3dd5a28a91\": container with ID starting with a150eb7edc06fc7b830f6ee5465ba9379608e4d886850e0a437b8f3dd5a28a91 not found: ID does not exist" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.352808 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl6tt\" (UniqueName: \"kubernetes.io/projected/88c0d090-0803-4fff-a9a3-9b41529b8a23-kube-api-access-vl6tt\") pod \"88c0d090-0803-4fff-a9a3-9b41529b8a23\" (UID: \"88c0d090-0803-4fff-a9a3-9b41529b8a23\") " Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.360566 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88c0d090-0803-4fff-a9a3-9b41529b8a23-kube-api-access-vl6tt" (OuterVolumeSpecName: "kube-api-access-vl6tt") pod 
"88c0d090-0803-4fff-a9a3-9b41529b8a23" (UID: "88c0d090-0803-4fff-a9a3-9b41529b8a23"). InnerVolumeSpecName "kube-api-access-vl6tt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.454631 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-httpd-config\") pod \"f40de6a5-783a-4e65-8cfc-dd84f8652b6f\" (UID: \"f40de6a5-783a-4e65-8cfc-dd84f8652b6f\") " Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.454695 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-config\") pod \"f40de6a5-783a-4e65-8cfc-dd84f8652b6f\" (UID: \"f40de6a5-783a-4e65-8cfc-dd84f8652b6f\") " Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.454736 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p2h6\" (UniqueName: \"kubernetes.io/projected/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-kube-api-access-6p2h6\") pod \"f40de6a5-783a-4e65-8cfc-dd84f8652b6f\" (UID: \"f40de6a5-783a-4e65-8cfc-dd84f8652b6f\") " Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.454810 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-ovndb-tls-certs\") pod \"f40de6a5-783a-4e65-8cfc-dd84f8652b6f\" (UID: \"f40de6a5-783a-4e65-8cfc-dd84f8652b6f\") " Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.454842 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-combined-ca-bundle\") pod \"f40de6a5-783a-4e65-8cfc-dd84f8652b6f\" (UID: \"f40de6a5-783a-4e65-8cfc-dd84f8652b6f\") " Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.455326 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vl6tt\" (UniqueName: \"kubernetes.io/projected/88c0d090-0803-4fff-a9a3-9b41529b8a23-kube-api-access-vl6tt\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.464007 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f40de6a5-783a-4e65-8cfc-dd84f8652b6f" (UID: "f40de6a5-783a-4e65-8cfc-dd84f8652b6f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.465018 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-kube-api-access-6p2h6" (OuterVolumeSpecName: "kube-api-access-6p2h6") pod "f40de6a5-783a-4e65-8cfc-dd84f8652b6f" (UID: "f40de6a5-783a-4e65-8cfc-dd84f8652b6f"). InnerVolumeSpecName "kube-api-access-6p2h6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.557210 4681 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.557238 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p2h6\" (UniqueName: \"kubernetes.io/projected/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-kube-api-access-6p2h6\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.578600 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-config" (OuterVolumeSpecName: "config") pod "f40de6a5-783a-4e65-8cfc-dd84f8652b6f" (UID: "f40de6a5-783a-4e65-8cfc-dd84f8652b6f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.611077 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f40de6a5-783a-4e65-8cfc-dd84f8652b6f" (UID: "f40de6a5-783a-4e65-8cfc-dd84f8652b6f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.656058 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f40de6a5-783a-4e65-8cfc-dd84f8652b6f" (UID: "f40de6a5-783a-4e65-8cfc-dd84f8652b6f"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.661339 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.661364 4681 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.661375 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f40de6a5-783a-4e65-8cfc-dd84f8652b6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.839349 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.869612 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.881157 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 17:24:17 crc kubenswrapper[4681]: E1007 17:24:17.881775 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f40de6a5-783a-4e65-8cfc-dd84f8652b6f" containerName="neutron-api" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.881786 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="f40de6a5-783a-4e65-8cfc-dd84f8652b6f" containerName="neutron-api" Oct 07 17:24:17 crc kubenswrapper[4681]: E1007 17:24:17.881799 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88c0d090-0803-4fff-a9a3-9b41529b8a23" containerName="kube-state-metrics" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.881805 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c0d090-0803-4fff-a9a3-9b41529b8a23" containerName="kube-state-metrics" Oct 07 17:24:17 crc kubenswrapper[4681]: E1007 17:24:17.881842 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f40de6a5-783a-4e65-8cfc-dd84f8652b6f" containerName="neutron-httpd" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.881849 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="f40de6a5-783a-4e65-8cfc-dd84f8652b6f" containerName="neutron-httpd" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.882022 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="f40de6a5-783a-4e65-8cfc-dd84f8652b6f" containerName="neutron-httpd" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.882036 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="88c0d090-0803-4fff-a9a3-9b41529b8a23" containerName="kube-state-metrics" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.882046 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="f40de6a5-783a-4e65-8cfc-dd84f8652b6f" containerName="neutron-api" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.882628 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.886274 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.886456 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 07 17:24:17 crc kubenswrapper[4681]: I1007 17:24:17.894227 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 17:24:18 crc kubenswrapper[4681]: I1007 17:24:18.072134 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/92e5095e-22e9-46b1-900a-492f827a05eb-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"92e5095e-22e9-46b1-900a-492f827a05eb\") " pod="openstack/kube-state-metrics-0" Oct 07 17:24:18 crc kubenswrapper[4681]: I1007 17:24:18.072226 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/92e5095e-22e9-46b1-900a-492f827a05eb-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"92e5095e-22e9-46b1-900a-492f827a05eb\") " pod="openstack/kube-state-metrics-0" Oct 07 17:24:18 crc kubenswrapper[4681]: I1007 17:24:18.072264 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92e5095e-22e9-46b1-900a-492f827a05eb-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"92e5095e-22e9-46b1-900a-492f827a05eb\") " pod="openstack/kube-state-metrics-0" Oct 07 17:24:18 crc kubenswrapper[4681]: I1007 17:24:18.072295 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfg8z\" (UniqueName: \"kubernetes.io/projected/92e5095e-22e9-46b1-900a-492f827a05eb-kube-api-access-nfg8z\") pod \"kube-state-metrics-0\" (UID: \"92e5095e-22e9-46b1-900a-492f827a05eb\") " pod="openstack/kube-state-metrics-0" Oct 07 17:24:18 crc kubenswrapper[4681]: I1007 17:24:18.174176 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92e5095e-22e9-46b1-900a-492f827a05eb-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"92e5095e-22e9-46b1-900a-492f827a05eb\") " pod="openstack/kube-state-metrics-0" Oct 07 17:24:18 crc kubenswrapper[4681]: I1007 17:24:18.174491 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfg8z\" (UniqueName: \"kubernetes.io/projected/92e5095e-22e9-46b1-900a-492f827a05eb-kube-api-access-nfg8z\") pod \"kube-state-metrics-0\" (UID: \"92e5095e-22e9-46b1-900a-492f827a05eb\") " pod="openstack/kube-state-metrics-0" Oct 07 17:24:18 crc kubenswrapper[4681]: I1007 17:24:18.174583 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/92e5095e-22e9-46b1-900a-492f827a05eb-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"92e5095e-22e9-46b1-900a-492f827a05eb\") " pod="openstack/kube-state-metrics-0" Oct 07 17:24:18 crc kubenswrapper[4681]: I1007 17:24:18.174653 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/92e5095e-22e9-46b1-900a-492f827a05eb-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"92e5095e-22e9-46b1-900a-492f827a05eb\") " pod="openstack/kube-state-metrics-0" Oct 07 17:24:18 crc kubenswrapper[4681]: I1007 17:24:18.181042 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92e5095e-22e9-46b1-900a-492f827a05eb-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"92e5095e-22e9-46b1-900a-492f827a05eb\") " pod="openstack/kube-state-metrics-0" Oct 07 17:24:18 crc kubenswrapper[4681]: I1007 17:24:18.181725 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/92e5095e-22e9-46b1-900a-492f827a05eb-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"92e5095e-22e9-46b1-900a-492f827a05eb\") " pod="openstack/kube-state-metrics-0" Oct 07 17:24:18 crc kubenswrapper[4681]: I1007 17:24:18.189237 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/92e5095e-22e9-46b1-900a-492f827a05eb-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"92e5095e-22e9-46b1-900a-492f827a05eb\") " pod="openstack/kube-state-metrics-0" Oct 07 17:24:18 crc kubenswrapper[4681]: I1007 17:24:18.193104 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfg8z\" (UniqueName: \"kubernetes.io/projected/92e5095e-22e9-46b1-900a-492f827a05eb-kube-api-access-nfg8z\") pod \"kube-state-metrics-0\" (UID: \"92e5095e-22e9-46b1-900a-492f827a05eb\") " pod="openstack/kube-state-metrics-0" Oct 07 17:24:18 crc kubenswrapper[4681]: I1007 17:24:18.230404 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 17:24:18 crc kubenswrapper[4681]: I1007 17:24:18.242526 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67c6d485d8-ww4wp" event={"ID":"f40de6a5-783a-4e65-8cfc-dd84f8652b6f","Type":"ContainerDied","Data":"94551cb12b5d6ba70361ea61af8f982558a814055961036dceeca55b353add80"} Oct 07 17:24:18 crc kubenswrapper[4681]: I1007 17:24:18.242577 4681 scope.go:117] "RemoveContainer" containerID="dd8f65c730c61845fe20219f2d0e5fe26b5bbf646d4b0ae5bc18d711ed0d103c" Oct 07 17:24:18 crc kubenswrapper[4681]: I1007 17:24:18.242684 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67c6d485d8-ww4wp" Oct 07 17:24:18 crc kubenswrapper[4681]: I1007 17:24:18.251724 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9","Type":"ContainerStarted","Data":"0407de19e3041c507d9097f6b59faab980973c4f533e2ddb306d3fc204096b6a"} Oct 07 17:24:18 crc kubenswrapper[4681]: I1007 17:24:18.293010 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-67c6d485d8-ww4wp"] Oct 07 17:24:18 crc kubenswrapper[4681]: I1007 17:24:18.320475 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-67c6d485d8-ww4wp"] Oct 07 17:24:18 crc kubenswrapper[4681]: I1007 17:24:18.329721 4681 scope.go:117] "RemoveContainer" containerID="dbbb9faedc762111f44aab2a08b7341d199126351c97bd8b447a1397e913d93a" Oct 07 17:24:18 crc kubenswrapper[4681]: I1007 17:24:18.628858 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 17:24:19 crc kubenswrapper[4681]: I1007 17:24:19.047779 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88c0d090-0803-4fff-a9a3-9b41529b8a23" path="/var/lib/kubelet/pods/88c0d090-0803-4fff-a9a3-9b41529b8a23/volumes" Oct 07 17:24:19 crc kubenswrapper[4681]: I1007 17:24:19.049052 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f40de6a5-783a-4e65-8cfc-dd84f8652b6f" path="/var/lib/kubelet/pods/f40de6a5-783a-4e65-8cfc-dd84f8652b6f/volumes" Oct 07 17:24:19 crc kubenswrapper[4681]: I1007 17:24:19.267892 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9","Type":"ContainerStarted","Data":"3374871a69118a3ce4d930dbf691cc0207e4fd4ee47c93d7598a71edf417fdcb"} Oct 07 17:24:19 crc kubenswrapper[4681]: I1007 17:24:19.270312 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"92e5095e-22e9-46b1-900a-492f827a05eb","Type":"ContainerStarted","Data":"16c4d9e83334618f8dcab89f607577440a297dfe7492673d844ef95fa4e75f7f"} Oct 07 17:24:19 crc kubenswrapper[4681]: I1007 17:24:19.305210 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=12.305194929 podStartE2EDuration="12.305194929s" podCreationTimestamp="2025-10-07 17:24:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:24:19.299234873 +0000 UTC m=+1262.946646428" watchObservedRunningTime="2025-10-07 17:24:19.305194929 +0000 UTC m=+1262.952606484" Oct 07 17:24:20 crc kubenswrapper[4681]: I1007 17:24:20.281588 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"92e5095e-22e9-46b1-900a-492f827a05eb","Type":"ContainerStarted","Data":"ecfbac54b0b6e218908f4825c58187a1ebd3329173b5cba25fd12f8369e18ef0"} Oct 07 17:24:20 crc kubenswrapper[4681]: I1007 17:24:20.281909 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 07 17:24:20 crc kubenswrapper[4681]: I1007 17:24:20.309305 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.952559344 podStartE2EDuration="3.309287403s" podCreationTimestamp="2025-10-07 17:24:17 +0000 UTC" firstStartedPulling="2025-10-07 17:24:18.652798553 +0000 UTC m=+1262.300210118" 
lastFinishedPulling="2025-10-07 17:24:19.009526622 +0000 UTC m=+1262.656938177" observedRunningTime="2025-10-07 17:24:20.305507497 +0000 UTC m=+1263.952919052" watchObservedRunningTime="2025-10-07 17:24:20.309287403 +0000 UTC m=+1263.956698958" Oct 07 17:24:20 crc kubenswrapper[4681]: I1007 17:24:20.530970 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 17:24:20 crc kubenswrapper[4681]: I1007 17:24:20.531244 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="932e68dd-1e76-4bf3-8fe6-4d34de164e74" containerName="ceilometer-central-agent" containerID="cri-o://541b74afc145f6f164980d4956e23d08a3fe408daa562e5e211972ea8f36f713" gracePeriod=30 Oct 07 17:24:20 crc kubenswrapper[4681]: I1007 17:24:20.531628 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="932e68dd-1e76-4bf3-8fe6-4d34de164e74" containerName="proxy-httpd" containerID="cri-o://0b7e194c466bfda2a65b9469af21a010c6ca66dfe818fadf0281802b1a8bd81c" gracePeriod=30 Oct 07 17:24:20 crc kubenswrapper[4681]: I1007 17:24:20.531674 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="932e68dd-1e76-4bf3-8fe6-4d34de164e74" containerName="sg-core" containerID="cri-o://abdf8223b4adb020702f680ededf76a5d25ca568e07de8c6214d1a24ac0a59e5" gracePeriod=30 Oct 07 17:24:20 crc kubenswrapper[4681]: I1007 17:24:20.531707 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="932e68dd-1e76-4bf3-8fe6-4d34de164e74" containerName="ceilometer-notification-agent" containerID="cri-o://1c4b49320b5dd5267cffbb1b0a3e5df00a7564f7bbd6046738f059f604d93c65" gracePeriod=30 Oct 07 17:24:21 crc kubenswrapper[4681]: I1007 17:24:21.291140 4681 generic.go:334] "Generic (PLEG): container finished" podID="932e68dd-1e76-4bf3-8fe6-4d34de164e74" containerID="0b7e194c466bfda2a65b9469af21a010c6ca66dfe818fadf0281802b1a8bd81c" exitCode=0 Oct 07 17:24:21 crc kubenswrapper[4681]: I1007 17:24:21.291181 4681 generic.go:334] "Generic (PLEG): container finished" podID="932e68dd-1e76-4bf3-8fe6-4d34de164e74" containerID="abdf8223b4adb020702f680ededf76a5d25ca568e07de8c6214d1a24ac0a59e5" exitCode=2 Oct 07 17:24:21 crc kubenswrapper[4681]: I1007 17:24:21.291191 4681 generic.go:334] "Generic (PLEG): container finished" podID="932e68dd-1e76-4bf3-8fe6-4d34de164e74" containerID="541b74afc145f6f164980d4956e23d08a3fe408daa562e5e211972ea8f36f713" exitCode=0 Oct 07 17:24:21 crc kubenswrapper[4681]: I1007 17:24:21.291213 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"932e68dd-1e76-4bf3-8fe6-4d34de164e74","Type":"ContainerDied","Data":"0b7e194c466bfda2a65b9469af21a010c6ca66dfe818fadf0281802b1a8bd81c"} Oct 07 17:24:21 crc kubenswrapper[4681]: I1007 17:24:21.291257 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"932e68dd-1e76-4bf3-8fe6-4d34de164e74","Type":"ContainerDied","Data":"abdf8223b4adb020702f680ededf76a5d25ca568e07de8c6214d1a24ac0a59e5"} Oct 07 17:24:21 crc kubenswrapper[4681]: I1007 17:24:21.291269 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"932e68dd-1e76-4bf3-8fe6-4d34de164e74","Type":"ContainerDied","Data":"541b74afc145f6f164980d4956e23d08a3fe408daa562e5e211972ea8f36f713"} Oct 07 17:24:22 crc kubenswrapper[4681]: I1007 17:24:22.385779 4681 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 07 17:24:22 crc kubenswrapper[4681]: I1007 17:24:22.619641 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 07 17:24:24 crc kubenswrapper[4681]: I1007 17:24:24.320792 4681 generic.go:334] "Generic (PLEG): container finished" podID="932e68dd-1e76-4bf3-8fe6-4d34de164e74" containerID="1c4b49320b5dd5267cffbb1b0a3e5df00a7564f7bbd6046738f059f604d93c65" exitCode=0 Oct 07 17:24:24 crc kubenswrapper[4681]: I1007 17:24:24.321151 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"932e68dd-1e76-4bf3-8fe6-4d34de164e74","Type":"ContainerDied","Data":"1c4b49320b5dd5267cffbb1b0a3e5df00a7564f7bbd6046738f059f604d93c65"} Oct 07 17:24:24 crc kubenswrapper[4681]: I1007 17:24:24.683939 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 17:24:24 crc kubenswrapper[4681]: I1007 17:24:24.802859 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nd6m\" (UniqueName: \"kubernetes.io/projected/932e68dd-1e76-4bf3-8fe6-4d34de164e74-kube-api-access-7nd6m\") pod \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " Oct 07 17:24:24 crc kubenswrapper[4681]: I1007 17:24:24.802934 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/932e68dd-1e76-4bf3-8fe6-4d34de164e74-run-httpd\") pod \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " Oct 07 17:24:24 crc kubenswrapper[4681]: I1007 17:24:24.803046 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/932e68dd-1e76-4bf3-8fe6-4d34de164e74-combined-ca-bundle\") pod \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " Oct 07 17:24:24 crc kubenswrapper[4681]: I1007 17:24:24.803089 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/932e68dd-1e76-4bf3-8fe6-4d34de164e74-log-httpd\") pod \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " Oct 07 17:24:24 crc kubenswrapper[4681]: I1007 17:24:24.803104 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/932e68dd-1e76-4bf3-8fe6-4d34de164e74-scripts\") pod \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " Oct 07 17:24:24 crc kubenswrapper[4681]: I1007 17:24:24.803362 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/932e68dd-1e76-4bf3-8fe6-4d34de164e74-config-data\") pod \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " Oct 07 17:24:24 crc kubenswrapper[4681]: I1007 17:24:24.803373 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/932e68dd-1e76-4bf3-8fe6-4d34de164e74-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "932e68dd-1e76-4bf3-8fe6-4d34de164e74" (UID: "932e68dd-1e76-4bf3-8fe6-4d34de164e74"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:24:24 crc kubenswrapper[4681]: I1007 17:24:24.803410 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/932e68dd-1e76-4bf3-8fe6-4d34de164e74-sg-core-conf-yaml\") pod \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\" (UID: \"932e68dd-1e76-4bf3-8fe6-4d34de164e74\") " Oct 07 17:24:24 crc kubenswrapper[4681]: I1007 17:24:24.803780 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/932e68dd-1e76-4bf3-8fe6-4d34de164e74-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "932e68dd-1e76-4bf3-8fe6-4d34de164e74" (UID: "932e68dd-1e76-4bf3-8fe6-4d34de164e74"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:24:24 crc kubenswrapper[4681]: I1007 17:24:24.804487 4681 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/932e68dd-1e76-4bf3-8fe6-4d34de164e74-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:24 crc kubenswrapper[4681]: I1007 17:24:24.804518 4681 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/932e68dd-1e76-4bf3-8fe6-4d34de164e74-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:24 crc kubenswrapper[4681]: I1007 17:24:24.809111 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/932e68dd-1e76-4bf3-8fe6-4d34de164e74-scripts" (OuterVolumeSpecName: "scripts") pod "932e68dd-1e76-4bf3-8fe6-4d34de164e74" (UID: "932e68dd-1e76-4bf3-8fe6-4d34de164e74"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:24 crc kubenswrapper[4681]: I1007 17:24:24.827280 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/932e68dd-1e76-4bf3-8fe6-4d34de164e74-kube-api-access-7nd6m" (OuterVolumeSpecName: "kube-api-access-7nd6m") pod "932e68dd-1e76-4bf3-8fe6-4d34de164e74" (UID: "932e68dd-1e76-4bf3-8fe6-4d34de164e74"). InnerVolumeSpecName "kube-api-access-7nd6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:24:24 crc kubenswrapper[4681]: I1007 17:24:24.849814 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/932e68dd-1e76-4bf3-8fe6-4d34de164e74-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "932e68dd-1e76-4bf3-8fe6-4d34de164e74" (UID: "932e68dd-1e76-4bf3-8fe6-4d34de164e74"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:24 crc kubenswrapper[4681]: I1007 17:24:24.907013 4681 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/932e68dd-1e76-4bf3-8fe6-4d34de164e74-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:24 crc kubenswrapper[4681]: I1007 17:24:24.907049 4681 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/932e68dd-1e76-4bf3-8fe6-4d34de164e74-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:24 crc kubenswrapper[4681]: I1007 17:24:24.907064 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nd6m\" (UniqueName: \"kubernetes.io/projected/932e68dd-1e76-4bf3-8fe6-4d34de164e74-kube-api-access-7nd6m\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:24 crc kubenswrapper[4681]: I1007 17:24:24.914033 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/932e68dd-1e76-4bf3-8fe6-4d34de164e74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "932e68dd-1e76-4bf3-8fe6-4d34de164e74" (UID: "932e68dd-1e76-4bf3-8fe6-4d34de164e74"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:24 crc kubenswrapper[4681]: I1007 17:24:24.938020 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/932e68dd-1e76-4bf3-8fe6-4d34de164e74-config-data" (OuterVolumeSpecName: "config-data") pod "932e68dd-1e76-4bf3-8fe6-4d34de164e74" (UID: "932e68dd-1e76-4bf3-8fe6-4d34de164e74"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.008872 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/932e68dd-1e76-4bf3-8fe6-4d34de164e74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.008924 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/932e68dd-1e76-4bf3-8fe6-4d34de164e74-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.332952 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"932e68dd-1e76-4bf3-8fe6-4d34de164e74","Type":"ContainerDied","Data":"09f90350129fbcfdcdc9cd3cabcb142e2ea3ee7883236ed857ecb4396d580d42"} Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.333011 4681 scope.go:117] "RemoveContainer" containerID="0b7e194c466bfda2a65b9469af21a010c6ca66dfe818fadf0281802b1a8bd81c" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.333135 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.366274 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.393025 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.400737 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 17:24:25 crc kubenswrapper[4681]: E1007 17:24:25.401297 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="932e68dd-1e76-4bf3-8fe6-4d34de164e74" containerName="proxy-httpd" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.401316 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="932e68dd-1e76-4bf3-8fe6-4d34de164e74" containerName="proxy-httpd" Oct 07 17:24:25 crc kubenswrapper[4681]: E1007 17:24:25.401327 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="932e68dd-1e76-4bf3-8fe6-4d34de164e74" containerName="sg-core" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.401335 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="932e68dd-1e76-4bf3-8fe6-4d34de164e74" containerName="sg-core" Oct 07 17:24:25 crc kubenswrapper[4681]: E1007 17:24:25.401359 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="932e68dd-1e76-4bf3-8fe6-4d34de164e74" containerName="ceilometer-notification-agent" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.401367 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="932e68dd-1e76-4bf3-8fe6-4d34de164e74" containerName="ceilometer-notification-agent" Oct 07 17:24:25 crc kubenswrapper[4681]: E1007 17:24:25.401375 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="932e68dd-1e76-4bf3-8fe6-4d34de164e74" containerName="ceilometer-central-agent" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.401382 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="932e68dd-1e76-4bf3-8fe6-4d34de164e74" containerName="ceilometer-central-agent" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.401623 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="932e68dd-1e76-4bf3-8fe6-4d34de164e74" containerName="ceilometer-central-agent" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.401641 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="932e68dd-1e76-4bf3-8fe6-4d34de164e74" containerName="sg-core" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.401666 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="932e68dd-1e76-4bf3-8fe6-4d34de164e74" containerName="ceilometer-notification-agent" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.401677 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="932e68dd-1e76-4bf3-8fe6-4d34de164e74" containerName="proxy-httpd" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.403659 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.409115 4681 scope.go:117] "RemoveContainer" containerID="abdf8223b4adb020702f680ededf76a5d25ca568e07de8c6214d1a24ac0a59e5" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.409365 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.409625 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.409768 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.417523 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-scripts\") pod \"ceilometer-0\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " pod="openstack/ceilometer-0" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.430603 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.458111 4681 scope.go:117] "RemoveContainer" containerID="1c4b49320b5dd5267cffbb1b0a3e5df00a7564f7bbd6046738f059f604d93c65" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.476893 4681 scope.go:117] "RemoveContainer" containerID="541b74afc145f6f164980d4956e23d08a3fe408daa562e5e211972ea8f36f713" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.520694 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " pod="openstack/ceilometer-0" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.520745 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2j72\" (UniqueName: \"kubernetes.io/projected/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-kube-api-access-v2j72\") pod \"ceilometer-0\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " pod="openstack/ceilometer-0" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.520781 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " pod="openstack/ceilometer-0" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.520930 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " pod="openstack/ceilometer-0" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.520971 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-log-httpd\") pod \"ceilometer-0\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " pod="openstack/ceilometer-0" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 
17:24:25.521019 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-scripts\") pod \"ceilometer-0\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " pod="openstack/ceilometer-0" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.521057 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-config-data\") pod \"ceilometer-0\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " pod="openstack/ceilometer-0" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.521284 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-run-httpd\") pod \"ceilometer-0\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " pod="openstack/ceilometer-0" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.531773 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-scripts\") pod \"ceilometer-0\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " pod="openstack/ceilometer-0" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.622795 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " pod="openstack/ceilometer-0" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.622839 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2j72\" (UniqueName: \"kubernetes.io/projected/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-kube-api-access-v2j72\") pod \"ceilometer-0\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " pod="openstack/ceilometer-0" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.622896 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " pod="openstack/ceilometer-0" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.622932 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-log-httpd\") pod \"ceilometer-0\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " pod="openstack/ceilometer-0" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.622951 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " pod="openstack/ceilometer-0" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.622980 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-config-data\") pod \"ceilometer-0\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " pod="openstack/ceilometer-0" Oct 07 17:24:25 crc kubenswrapper[4681]: 
I1007 17:24:25.623050 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-run-httpd\") pod \"ceilometer-0\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " pod="openstack/ceilometer-0" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.623452 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-run-httpd\") pod \"ceilometer-0\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " pod="openstack/ceilometer-0" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.623550 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-log-httpd\") pod \"ceilometer-0\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " pod="openstack/ceilometer-0" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.626974 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " pod="openstack/ceilometer-0" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.628425 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-config-data\") pod \"ceilometer-0\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " pod="openstack/ceilometer-0" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.634719 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " pod="openstack/ceilometer-0" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.638302 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " pod="openstack/ceilometer-0" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.639019 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2j72\" (UniqueName: \"kubernetes.io/projected/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-kube-api-access-v2j72\") pod \"ceilometer-0\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " pod="openstack/ceilometer-0" Oct 07 17:24:25 crc kubenswrapper[4681]: I1007 17:24:25.742071 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 17:24:26 crc kubenswrapper[4681]: I1007 17:24:26.185807 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 17:24:26 crc kubenswrapper[4681]: I1007 17:24:26.344439 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfc5c9d4-2144-461c-b1ff-9061cb201ad6","Type":"ContainerStarted","Data":"605814b80ef71c259522613475421b4e12d362196df1edf6b9d796012f28369d"} Oct 07 17:24:27 crc kubenswrapper[4681]: I1007 17:24:27.060588 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="932e68dd-1e76-4bf3-8fe6-4d34de164e74" path="/var/lib/kubelet/pods/932e68dd-1e76-4bf3-8fe6-4d34de164e74/volumes" Oct 07 17:24:27 crc kubenswrapper[4681]: I1007 17:24:27.354434 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfc5c9d4-2144-461c-b1ff-9061cb201ad6","Type":"ContainerStarted","Data":"aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81"} Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.247787 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.282553 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.365756 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfc5c9d4-2144-461c-b1ff-9061cb201ad6","Type":"ContainerStarted","Data":"5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc"} Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.369730 4681 generic.go:334] "Generic (PLEG): container finished" podID="7199ab3e-07fd-4c98-81e1-535f69a0f76d" containerID="705c9e025edf8e6fa16d7fe514262f1224439c95283275135d74f33374cc8e49" exitCode=137 Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.369767 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7199ab3e-07fd-4c98-81e1-535f69a0f76d","Type":"ContainerDied","Data":"705c9e025edf8e6fa16d7fe514262f1224439c95283275135d74f33374cc8e49"} Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.369805 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7199ab3e-07fd-4c98-81e1-535f69a0f76d","Type":"ContainerDied","Data":"d0d019e3322dc416a62b0b0eae3deba3b2dfbc58229e84a4a77b554f459396c1"} Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.369821 4681 scope.go:117] "RemoveContainer" containerID="705c9e025edf8e6fa16d7fe514262f1224439c95283275135d74f33374cc8e49" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.370029 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.399411 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f7vm\" (UniqueName: \"kubernetes.io/projected/7199ab3e-07fd-4c98-81e1-535f69a0f76d-kube-api-access-9f7vm\") pod \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.399471 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7199ab3e-07fd-4c98-81e1-535f69a0f76d-config-data-custom\") pod \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.399499 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7199ab3e-07fd-4c98-81e1-535f69a0f76d-scripts\") pod \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.399524 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7199ab3e-07fd-4c98-81e1-535f69a0f76d-etc-machine-id\") pod \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.399616 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7199ab3e-07fd-4c98-81e1-535f69a0f76d-config-data\") pod \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.399665 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7199ab3e-07fd-4c98-81e1-535f69a0f76d-logs\") pod \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.399695 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7199ab3e-07fd-4c98-81e1-535f69a0f76d-combined-ca-bundle\") pod \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\" (UID: \"7199ab3e-07fd-4c98-81e1-535f69a0f76d\") " Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.402036 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7199ab3e-07fd-4c98-81e1-535f69a0f76d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7199ab3e-07fd-4c98-81e1-535f69a0f76d" (UID: "7199ab3e-07fd-4c98-81e1-535f69a0f76d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.402362 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7199ab3e-07fd-4c98-81e1-535f69a0f76d-logs" (OuterVolumeSpecName: "logs") pod "7199ab3e-07fd-4c98-81e1-535f69a0f76d" (UID: "7199ab3e-07fd-4c98-81e1-535f69a0f76d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.405018 4681 scope.go:117] "RemoveContainer" containerID="388fc72fde632c23d72981d68bc7c1f3915f6462bc0048c894f56576ec24d24e" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.406057 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7199ab3e-07fd-4c98-81e1-535f69a0f76d-scripts" (OuterVolumeSpecName: "scripts") pod "7199ab3e-07fd-4c98-81e1-535f69a0f76d" (UID: "7199ab3e-07fd-4c98-81e1-535f69a0f76d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.408629 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7199ab3e-07fd-4c98-81e1-535f69a0f76d-kube-api-access-9f7vm" (OuterVolumeSpecName: "kube-api-access-9f7vm") pod "7199ab3e-07fd-4c98-81e1-535f69a0f76d" (UID: "7199ab3e-07fd-4c98-81e1-535f69a0f76d"). InnerVolumeSpecName "kube-api-access-9f7vm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.421131 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7199ab3e-07fd-4c98-81e1-535f69a0f76d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7199ab3e-07fd-4c98-81e1-535f69a0f76d" (UID: "7199ab3e-07fd-4c98-81e1-535f69a0f76d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.471805 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7199ab3e-07fd-4c98-81e1-535f69a0f76d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7199ab3e-07fd-4c98-81e1-535f69a0f76d" (UID: "7199ab3e-07fd-4c98-81e1-535f69a0f76d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.505164 4681 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7199ab3e-07fd-4c98-81e1-535f69a0f76d-logs\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.505196 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7199ab3e-07fd-4c98-81e1-535f69a0f76d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.505207 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f7vm\" (UniqueName: \"kubernetes.io/projected/7199ab3e-07fd-4c98-81e1-535f69a0f76d-kube-api-access-9f7vm\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.505216 4681 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7199ab3e-07fd-4c98-81e1-535f69a0f76d-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.505224 4681 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7199ab3e-07fd-4c98-81e1-535f69a0f76d-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.505234 4681 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7199ab3e-07fd-4c98-81e1-535f69a0f76d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.556013 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7199ab3e-07fd-4c98-81e1-535f69a0f76d-config-data" (OuterVolumeSpecName: "config-data") pod "7199ab3e-07fd-4c98-81e1-535f69a0f76d" (UID: "7199ab3e-07fd-4c98-81e1-535f69a0f76d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.606420 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7199ab3e-07fd-4c98-81e1-535f69a0f76d-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.621956 4681 scope.go:117] "RemoveContainer" containerID="705c9e025edf8e6fa16d7fe514262f1224439c95283275135d74f33374cc8e49" Oct 07 17:24:28 crc kubenswrapper[4681]: E1007 17:24:28.622476 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"705c9e025edf8e6fa16d7fe514262f1224439c95283275135d74f33374cc8e49\": container with ID starting with 705c9e025edf8e6fa16d7fe514262f1224439c95283275135d74f33374cc8e49 not found: ID does not exist" containerID="705c9e025edf8e6fa16d7fe514262f1224439c95283275135d74f33374cc8e49" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.622532 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"705c9e025edf8e6fa16d7fe514262f1224439c95283275135d74f33374cc8e49"} err="failed to get container status \"705c9e025edf8e6fa16d7fe514262f1224439c95283275135d74f33374cc8e49\": rpc error: code = NotFound desc = could not find container \"705c9e025edf8e6fa16d7fe514262f1224439c95283275135d74f33374cc8e49\": container with ID starting with 705c9e025edf8e6fa16d7fe514262f1224439c95283275135d74f33374cc8e49 not found: ID does not exist" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.622558 4681 scope.go:117] "RemoveContainer" containerID="388fc72fde632c23d72981d68bc7c1f3915f6462bc0048c894f56576ec24d24e" Oct 07 17:24:28 crc kubenswrapper[4681]: E1007 17:24:28.624514 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"388fc72fde632c23d72981d68bc7c1f3915f6462bc0048c894f56576ec24d24e\": container with ID starting with 388fc72fde632c23d72981d68bc7c1f3915f6462bc0048c894f56576ec24d24e not found: ID does not exist" containerID="388fc72fde632c23d72981d68bc7c1f3915f6462bc0048c894f56576ec24d24e" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.624588 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"388fc72fde632c23d72981d68bc7c1f3915f6462bc0048c894f56576ec24d24e"} err="failed to get container status \"388fc72fde632c23d72981d68bc7c1f3915f6462bc0048c894f56576ec24d24e\": rpc error: code = NotFound desc = could not find container \"388fc72fde632c23d72981d68bc7c1f3915f6462bc0048c894f56576ec24d24e\": container with ID starting with 388fc72fde632c23d72981d68bc7c1f3915f6462bc0048c894f56576ec24d24e not found: ID does not exist" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.703254 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.718866 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.727435 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 07 17:24:28 crc kubenswrapper[4681]: E1007 17:24:28.727850 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7199ab3e-07fd-4c98-81e1-535f69a0f76d" containerName="cinder-api" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.727866 4681 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7199ab3e-07fd-4c98-81e1-535f69a0f76d" containerName="cinder-api" Oct 07 17:24:28 crc kubenswrapper[4681]: E1007 17:24:28.727905 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7199ab3e-07fd-4c98-81e1-535f69a0f76d" containerName="cinder-api-log" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.727911 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="7199ab3e-07fd-4c98-81e1-535f69a0f76d" containerName="cinder-api-log" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.728082 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="7199ab3e-07fd-4c98-81e1-535f69a0f76d" containerName="cinder-api-log" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.728097 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="7199ab3e-07fd-4c98-81e1-535f69a0f76d" containerName="cinder-api" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.733603 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.735816 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.735930 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.736318 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.736521 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.809847 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4rfc\" (UniqueName: \"kubernetes.io/projected/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-kube-api-access-v4rfc\") pod \"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.809973 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-logs\") pod \"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.810007 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-public-tls-certs\") pod \"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.810046 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.810066 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-config-data-custom\") pod \"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " 
pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.810119 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-scripts\") pod \"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.810166 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.810249 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-config-data\") pod \"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.810278 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.911834 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-public-tls-certs\") pod \"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.911944 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.912015 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.912057 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-config-data-custom\") pod \"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.912098 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-scripts\") pod \"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.912129 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-internal-tls-certs\") pod 
\"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.912657 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-config-data\") pod \"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.912690 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.912726 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4rfc\" (UniqueName: \"kubernetes.io/projected/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-kube-api-access-v4rfc\") pod \"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.912788 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-logs\") pod \"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.913098 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-logs\") pod \"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.918162 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-scripts\") pod \"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.919572 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.933204 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-config-data\") pod \"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.933783 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.933940 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-config-data-custom\") pod \"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " 
pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.938381 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4rfc\" (UniqueName: \"kubernetes.io/projected/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-kube-api-access-v4rfc\") pod \"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " pod="openstack/cinder-api-0" Oct 07 17:24:28 crc kubenswrapper[4681]: I1007 17:24:28.941427 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7-public-tls-certs\") pod \"cinder-api-0\" (UID: \"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7\") " pod="openstack/cinder-api-0" Oct 07 17:24:29 crc kubenswrapper[4681]: I1007 17:24:29.039853 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7199ab3e-07fd-4c98-81e1-535f69a0f76d" path="/var/lib/kubelet/pods/7199ab3e-07fd-4c98-81e1-535f69a0f76d/volumes" Oct 07 17:24:29 crc kubenswrapper[4681]: I1007 17:24:29.049156 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 07 17:24:29 crc kubenswrapper[4681]: I1007 17:24:29.393770 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfc5c9d4-2144-461c-b1ff-9061cb201ad6","Type":"ContainerStarted","Data":"61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc"} Oct 07 17:24:29 crc kubenswrapper[4681]: I1007 17:24:29.515753 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 17:24:29 crc kubenswrapper[4681]: W1007 17:24:29.525784 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d12fb3d_a5e9_450f_a6c5_abde4bb79bc7.slice/crio-94b5bd4b10ef6b97eab41a1a084b8bf72f197aac97344bb5eb9d6559725b8972 WatchSource:0}: Error finding container 94b5bd4b10ef6b97eab41a1a084b8bf72f197aac97344bb5eb9d6559725b8972: Status 404 returned error can't find the container with id 94b5bd4b10ef6b97eab41a1a084b8bf72f197aac97344bb5eb9d6559725b8972 Oct 07 17:24:30 crc kubenswrapper[4681]: I1007 17:24:30.425227 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7","Type":"ContainerStarted","Data":"2b78bc0d1d46d4176f97c916e63e1a4ccd688c455b89670a839b855a1e804e51"} Oct 07 17:24:30 crc kubenswrapper[4681]: I1007 17:24:30.425457 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7","Type":"ContainerStarted","Data":"94b5bd4b10ef6b97eab41a1a084b8bf72f197aac97344bb5eb9d6559725b8972"} Oct 07 17:24:31 crc kubenswrapper[4681]: I1007 17:24:31.436452 4681 generic.go:334] "Generic (PLEG): container finished" podID="be2ce7e8-5280-4cfa-b2b1-d680465cd889" containerID="db30f3a7f2355011a879a9893bac5559f8ade79cfc795e4262ee88633bf5eb1c" exitCode=0 Oct 07 17:24:31 crc kubenswrapper[4681]: I1007 17:24:31.436528 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-djn84" event={"ID":"be2ce7e8-5280-4cfa-b2b1-d680465cd889","Type":"ContainerDied","Data":"db30f3a7f2355011a879a9893bac5559f8ade79cfc795e4262ee88633bf5eb1c"} Oct 07 17:24:31 crc kubenswrapper[4681]: I1007 17:24:31.440013 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"cfc5c9d4-2144-461c-b1ff-9061cb201ad6","Type":"ContainerStarted","Data":"6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74"} Oct 07 17:24:31 crc kubenswrapper[4681]: I1007 17:24:31.440143 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 17:24:31 crc kubenswrapper[4681]: I1007 17:24:31.442105 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7","Type":"ContainerStarted","Data":"de1b4cfa7af746cc8b625c913e6f2d5f1332da8a6c8d602ca5fa668ebedb7eff"} Oct 07 17:24:31 crc kubenswrapper[4681]: I1007 17:24:31.442255 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 07 17:24:31 crc kubenswrapper[4681]: I1007 17:24:31.487815 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.487796916 podStartE2EDuration="3.487796916s" podCreationTimestamp="2025-10-07 17:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:24:31.479618178 +0000 UTC m=+1275.127029733" watchObservedRunningTime="2025-10-07 17:24:31.487796916 +0000 UTC m=+1275.135208471" Oct 07 17:24:31 crc kubenswrapper[4681]: I1007 17:24:31.503427 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.068814238 podStartE2EDuration="6.503407281s" podCreationTimestamp="2025-10-07 17:24:25 +0000 UTC" firstStartedPulling="2025-10-07 17:24:26.196985149 +0000 UTC m=+1269.844396704" lastFinishedPulling="2025-10-07 17:24:30.631578192 +0000 UTC m=+1274.278989747" observedRunningTime="2025-10-07 17:24:31.500278634 +0000 UTC m=+1275.147690199" watchObservedRunningTime="2025-10-07 17:24:31.503407281 +0000 UTC m=+1275.150818826" Oct 07 17:24:32 crc kubenswrapper[4681]: I1007 17:24:32.824770 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-djn84" Oct 07 17:24:32 crc kubenswrapper[4681]: I1007 17:24:32.991946 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be2ce7e8-5280-4cfa-b2b1-d680465cd889-config-data\") pod \"be2ce7e8-5280-4cfa-b2b1-d680465cd889\" (UID: \"be2ce7e8-5280-4cfa-b2b1-d680465cd889\") " Oct 07 17:24:32 crc kubenswrapper[4681]: I1007 17:24:32.992814 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw5bv\" (UniqueName: \"kubernetes.io/projected/be2ce7e8-5280-4cfa-b2b1-d680465cd889-kube-api-access-nw5bv\") pod \"be2ce7e8-5280-4cfa-b2b1-d680465cd889\" (UID: \"be2ce7e8-5280-4cfa-b2b1-d680465cd889\") " Oct 07 17:24:32 crc kubenswrapper[4681]: I1007 17:24:32.992852 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be2ce7e8-5280-4cfa-b2b1-d680465cd889-scripts\") pod \"be2ce7e8-5280-4cfa-b2b1-d680465cd889\" (UID: \"be2ce7e8-5280-4cfa-b2b1-d680465cd889\") " Oct 07 17:24:32 crc kubenswrapper[4681]: I1007 17:24:32.993012 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be2ce7e8-5280-4cfa-b2b1-d680465cd889-combined-ca-bundle\") pod \"be2ce7e8-5280-4cfa-b2b1-d680465cd889\" (UID: \"be2ce7e8-5280-4cfa-b2b1-d680465cd889\") " Oct 07 17:24:32 crc kubenswrapper[4681]: I1007 17:24:32.998153 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be2ce7e8-5280-4cfa-b2b1-d680465cd889-scripts" (OuterVolumeSpecName: "scripts") pod "be2ce7e8-5280-4cfa-b2b1-d680465cd889" (UID: "be2ce7e8-5280-4cfa-b2b1-d680465cd889"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.014185 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be2ce7e8-5280-4cfa-b2b1-d680465cd889-kube-api-access-nw5bv" (OuterVolumeSpecName: "kube-api-access-nw5bv") pod "be2ce7e8-5280-4cfa-b2b1-d680465cd889" (UID: "be2ce7e8-5280-4cfa-b2b1-d680465cd889"). InnerVolumeSpecName "kube-api-access-nw5bv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.019030 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be2ce7e8-5280-4cfa-b2b1-d680465cd889-config-data" (OuterVolumeSpecName: "config-data") pod "be2ce7e8-5280-4cfa-b2b1-d680465cd889" (UID: "be2ce7e8-5280-4cfa-b2b1-d680465cd889"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.026689 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be2ce7e8-5280-4cfa-b2b1-d680465cd889-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be2ce7e8-5280-4cfa-b2b1-d680465cd889" (UID: "be2ce7e8-5280-4cfa-b2b1-d680465cd889"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.095215 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be2ce7e8-5280-4cfa-b2b1-d680465cd889-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.095252 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be2ce7e8-5280-4cfa-b2b1-d680465cd889-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.095266 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw5bv\" (UniqueName: \"kubernetes.io/projected/be2ce7e8-5280-4cfa-b2b1-d680465cd889-kube-api-access-nw5bv\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.095279 4681 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be2ce7e8-5280-4cfa-b2b1-d680465cd889-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.460941 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-djn84" event={"ID":"be2ce7e8-5280-4cfa-b2b1-d680465cd889","Type":"ContainerDied","Data":"fa03ba9b6556f90d5e204175c9dea6ce67a3ba0c0cfb71a7481d2ecc4fdb820d"} Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.460968 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-djn84" Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.461173 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa03ba9b6556f90d5e204175c9dea6ce67a3ba0c0cfb71a7481d2ecc4fdb820d" Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.553230 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 17:24:33 crc kubenswrapper[4681]: E1007 17:24:33.553714 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be2ce7e8-5280-4cfa-b2b1-d680465cd889" containerName="nova-cell0-conductor-db-sync" Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.553739 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="be2ce7e8-5280-4cfa-b2b1-d680465cd889" containerName="nova-cell0-conductor-db-sync" Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.554015 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="be2ce7e8-5280-4cfa-b2b1-d680465cd889" containerName="nova-cell0-conductor-db-sync" Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.554742 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.558282 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.560191 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rj5mp" Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.572335 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.709124 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwvtr\" (UniqueName: \"kubernetes.io/projected/ef096ee9-933c-44da-a4b7-6cc5b62ecc49-kube-api-access-jwvtr\") pod \"nova-cell0-conductor-0\" (UID: \"ef096ee9-933c-44da-a4b7-6cc5b62ecc49\") " pod="openstack/nova-cell0-conductor-0" Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.709242 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef096ee9-933c-44da-a4b7-6cc5b62ecc49-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ef096ee9-933c-44da-a4b7-6cc5b62ecc49\") " pod="openstack/nova-cell0-conductor-0" Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.709345 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef096ee9-933c-44da-a4b7-6cc5b62ecc49-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ef096ee9-933c-44da-a4b7-6cc5b62ecc49\") " pod="openstack/nova-cell0-conductor-0" Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.714424 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.714859 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cfc5c9d4-2144-461c-b1ff-9061cb201ad6" containerName="ceilometer-notification-agent" containerID="cri-o://5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc" gracePeriod=30 Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.715008 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cfc5c9d4-2144-461c-b1ff-9061cb201ad6" containerName="proxy-httpd" containerID="cri-o://6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74" gracePeriod=30 Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.715001 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cfc5c9d4-2144-461c-b1ff-9061cb201ad6" containerName="sg-core" containerID="cri-o://61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc" gracePeriod=30 Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.714828 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cfc5c9d4-2144-461c-b1ff-9061cb201ad6" containerName="ceilometer-central-agent" containerID="cri-o://aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81" gracePeriod=30 Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.811528 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ef096ee9-933c-44da-a4b7-6cc5b62ecc49-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ef096ee9-933c-44da-a4b7-6cc5b62ecc49\") " pod="openstack/nova-cell0-conductor-0" Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.811611 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef096ee9-933c-44da-a4b7-6cc5b62ecc49-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ef096ee9-933c-44da-a4b7-6cc5b62ecc49\") " pod="openstack/nova-cell0-conductor-0" Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.811716 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwvtr\" (UniqueName: \"kubernetes.io/projected/ef096ee9-933c-44da-a4b7-6cc5b62ecc49-kube-api-access-jwvtr\") pod \"nova-cell0-conductor-0\" (UID: \"ef096ee9-933c-44da-a4b7-6cc5b62ecc49\") " pod="openstack/nova-cell0-conductor-0" Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.823774 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef096ee9-933c-44da-a4b7-6cc5b62ecc49-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ef096ee9-933c-44da-a4b7-6cc5b62ecc49\") " pod="openstack/nova-cell0-conductor-0" Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.824666 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef096ee9-933c-44da-a4b7-6cc5b62ecc49-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ef096ee9-933c-44da-a4b7-6cc5b62ecc49\") " pod="openstack/nova-cell0-conductor-0" Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.848205 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwvtr\" (UniqueName: \"kubernetes.io/projected/ef096ee9-933c-44da-a4b7-6cc5b62ecc49-kube-api-access-jwvtr\") pod \"nova-cell0-conductor-0\" (UID: \"ef096ee9-933c-44da-a4b7-6cc5b62ecc49\") " pod="openstack/nova-cell0-conductor-0" Oct 07 17:24:33 crc kubenswrapper[4681]: I1007 17:24:33.870393 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.343423 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.463190 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.474583 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ef096ee9-933c-44da-a4b7-6cc5b62ecc49","Type":"ContainerStarted","Data":"ae666d32e5fe96c772527585948837c9b64273444f1989afac6514d5ce411e0f"} Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.495245 4681 generic.go:334] "Generic (PLEG): container finished" podID="cfc5c9d4-2144-461c-b1ff-9061cb201ad6" containerID="6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74" exitCode=0 Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.495276 4681 generic.go:334] "Generic (PLEG): container finished" podID="cfc5c9d4-2144-461c-b1ff-9061cb201ad6" containerID="61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc" exitCode=2 Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.495285 4681 generic.go:334] "Generic (PLEG): container finished" podID="cfc5c9d4-2144-461c-b1ff-9061cb201ad6" containerID="5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc" exitCode=0 Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.495292 4681 generic.go:334] "Generic (PLEG): container finished" podID="cfc5c9d4-2144-461c-b1ff-9061cb201ad6" containerID="aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81" exitCode=0 Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.495323 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfc5c9d4-2144-461c-b1ff-9061cb201ad6","Type":"ContainerDied","Data":"6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74"} Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.495344 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfc5c9d4-2144-461c-b1ff-9061cb201ad6","Type":"ContainerDied","Data":"61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc"} Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.495354 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfc5c9d4-2144-461c-b1ff-9061cb201ad6","Type":"ContainerDied","Data":"5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc"} Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.495363 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfc5c9d4-2144-461c-b1ff-9061cb201ad6","Type":"ContainerDied","Data":"aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81"} Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.495370 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfc5c9d4-2144-461c-b1ff-9061cb201ad6","Type":"ContainerDied","Data":"605814b80ef71c259522613475421b4e12d362196df1edf6b9d796012f28369d"} Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.495401 4681 scope.go:117] "RemoveContainer" containerID="6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.495532 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.521344 4681 scope.go:117] "RemoveContainer" containerID="61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.542070 4681 scope.go:117] "RemoveContainer" containerID="5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.559288 4681 scope.go:117] "RemoveContainer" containerID="aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.578118 4681 scope.go:117] "RemoveContainer" containerID="6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74" Oct 07 17:24:34 crc kubenswrapper[4681]: E1007 17:24:34.578615 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74\": container with ID starting with 6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74 not found: ID does not exist" containerID="6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.578661 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74"} err="failed to get container status \"6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74\": rpc error: code = NotFound desc = could not find container \"6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74\": container with ID starting with 6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74 not found: ID does not exist" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.578689 4681 scope.go:117] "RemoveContainer" containerID="61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc" Oct 07 17:24:34 crc kubenswrapper[4681]: E1007 17:24:34.579116 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc\": container with ID starting with 61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc not found: ID does not exist" containerID="61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.579148 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc"} err="failed to get container status \"61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc\": rpc error: code = NotFound desc = could not find container \"61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc\": container with ID starting with 61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc not found: ID does not exist" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.579176 4681 scope.go:117] "RemoveContainer" containerID="5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc" Oct 07 17:24:34 crc kubenswrapper[4681]: E1007 17:24:34.579419 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc\": container with ID starting with 
5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc not found: ID does not exist" containerID="5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.579450 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc"} err="failed to get container status \"5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc\": rpc error: code = NotFound desc = could not find container \"5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc\": container with ID starting with 5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc not found: ID does not exist" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.579466 4681 scope.go:117] "RemoveContainer" containerID="aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81" Oct 07 17:24:34 crc kubenswrapper[4681]: E1007 17:24:34.579715 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81\": container with ID starting with aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81 not found: ID does not exist" containerID="aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.579743 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81"} err="failed to get container status \"aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81\": rpc error: code = NotFound desc = could not find container \"aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81\": container with ID starting with aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81 not found: ID does not exist" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.579759 4681 scope.go:117] "RemoveContainer" containerID="6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.580036 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74"} err="failed to get container status \"6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74\": rpc error: code = NotFound desc = could not find container \"6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74\": container with ID starting with 6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74 not found: ID does not exist" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.580056 4681 scope.go:117] "RemoveContainer" containerID="61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.580258 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc"} err="failed to get container status \"61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc\": rpc error: code = NotFound desc = could not find container \"61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc\": container with ID starting with 61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc not found: ID does not exist" Oct 07 
17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.580281 4681 scope.go:117] "RemoveContainer" containerID="5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.580595 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc"} err="failed to get container status \"5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc\": rpc error: code = NotFound desc = could not find container \"5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc\": container with ID starting with 5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc not found: ID does not exist" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.580617 4681 scope.go:117] "RemoveContainer" containerID="aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.580974 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81"} err="failed to get container status \"aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81\": rpc error: code = NotFound desc = could not find container \"aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81\": container with ID starting with aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81 not found: ID does not exist" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.581011 4681 scope.go:117] "RemoveContainer" containerID="6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.581321 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74"} err="failed to get container status \"6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74\": rpc error: code = NotFound desc = could not find container \"6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74\": container with ID starting with 6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74 not found: ID does not exist" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.581344 4681 scope.go:117] "RemoveContainer" containerID="61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.581577 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc"} err="failed to get container status \"61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc\": rpc error: code = NotFound desc = could not find container \"61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc\": container with ID starting with 61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc not found: ID does not exist" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.581604 4681 scope.go:117] "RemoveContainer" containerID="5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.581921 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc"} err="failed to get container status 
\"5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc\": rpc error: code = NotFound desc = could not find container \"5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc\": container with ID starting with 5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc not found: ID does not exist" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.581951 4681 scope.go:117] "RemoveContainer" containerID="aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.582171 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81"} err="failed to get container status \"aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81\": rpc error: code = NotFound desc = could not find container \"aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81\": container with ID starting with aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81 not found: ID does not exist" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.582195 4681 scope.go:117] "RemoveContainer" containerID="6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.582793 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74"} err="failed to get container status \"6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74\": rpc error: code = NotFound desc = could not find container \"6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74\": container with ID starting with 6d151c94469dfc66c0cf090644d6372625cd1fd9613ef531ce1a992452cb6f74 not found: ID does not exist" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.582816 4681 scope.go:117] "RemoveContainer" containerID="61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.583068 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc"} err="failed to get container status \"61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc\": rpc error: code = NotFound desc = could not find container \"61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc\": container with ID starting with 61c16c4a5e14bb060b65373c3aff1f7d2cc3478f4878a8b3ef0e8cdc2fd2aecc not found: ID does not exist" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.583089 4681 scope.go:117] "RemoveContainer" containerID="5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.583361 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc"} err="failed to get container status \"5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc\": rpc error: code = NotFound desc = could not find container \"5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc\": container with ID starting with 5ab8f7f31e90de59f5b7683ed8ed91067c914864d88137a77dbccb534cc6fedc not found: ID does not exist" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.583382 4681 scope.go:117] "RemoveContainer" 
containerID="aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.583697 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81"} err="failed to get container status \"aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81\": rpc error: code = NotFound desc = could not find container \"aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81\": container with ID starting with aa83a2b64111541e89c74eada503fca5a59edb502ba7ca64fee44ee44f285e81 not found: ID does not exist" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.630335 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-scripts\") pod \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.630388 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-sg-core-conf-yaml\") pod \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.630447 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2j72\" (UniqueName: \"kubernetes.io/projected/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-kube-api-access-v2j72\") pod \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.630548 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-combined-ca-bundle\") pod \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.630581 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-ceilometer-tls-certs\") pod \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.630647 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-run-httpd\") pod \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.630671 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-config-data\") pod \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.630740 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-log-httpd\") pod \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\" (UID: \"cfc5c9d4-2144-461c-b1ff-9061cb201ad6\") " Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.631544 
4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cfc5c9d4-2144-461c-b1ff-9061cb201ad6" (UID: "cfc5c9d4-2144-461c-b1ff-9061cb201ad6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.631717 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cfc5c9d4-2144-461c-b1ff-9061cb201ad6" (UID: "cfc5c9d4-2144-461c-b1ff-9061cb201ad6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.635128 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-scripts" (OuterVolumeSpecName: "scripts") pod "cfc5c9d4-2144-461c-b1ff-9061cb201ad6" (UID: "cfc5c9d4-2144-461c-b1ff-9061cb201ad6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.647861 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-kube-api-access-v2j72" (OuterVolumeSpecName: "kube-api-access-v2j72") pod "cfc5c9d4-2144-461c-b1ff-9061cb201ad6" (UID: "cfc5c9d4-2144-461c-b1ff-9061cb201ad6"). InnerVolumeSpecName "kube-api-access-v2j72". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.661660 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cfc5c9d4-2144-461c-b1ff-9061cb201ad6" (UID: "cfc5c9d4-2144-461c-b1ff-9061cb201ad6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.683093 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "cfc5c9d4-2144-461c-b1ff-9061cb201ad6" (UID: "cfc5c9d4-2144-461c-b1ff-9061cb201ad6"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.699821 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfc5c9d4-2144-461c-b1ff-9061cb201ad6" (UID: "cfc5c9d4-2144-461c-b1ff-9061cb201ad6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.730605 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-config-data" (OuterVolumeSpecName: "config-data") pod "cfc5c9d4-2144-461c-b1ff-9061cb201ad6" (UID: "cfc5c9d4-2144-461c-b1ff-9061cb201ad6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.732229 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.732264 4681 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.732275 4681 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.732284 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.732293 4681 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.732302 4681 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.732310 4681 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.732318 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2j72\" (UniqueName: \"kubernetes.io/projected/cfc5c9d4-2144-461c-b1ff-9061cb201ad6-kube-api-access-v2j72\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.842503 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.852517 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.859269 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 17:24:34 crc kubenswrapper[4681]: E1007 17:24:34.859638 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc5c9d4-2144-461c-b1ff-9061cb201ad6" containerName="proxy-httpd" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.859656 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc5c9d4-2144-461c-b1ff-9061cb201ad6" containerName="proxy-httpd" Oct 07 17:24:34 crc kubenswrapper[4681]: E1007 17:24:34.859672 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc5c9d4-2144-461c-b1ff-9061cb201ad6" containerName="ceilometer-central-agent" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.859677 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc5c9d4-2144-461c-b1ff-9061cb201ad6" containerName="ceilometer-central-agent" Oct 07 17:24:34 crc kubenswrapper[4681]: E1007 17:24:34.859712 4681 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cfc5c9d4-2144-461c-b1ff-9061cb201ad6" containerName="ceilometer-notification-agent" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.859718 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc5c9d4-2144-461c-b1ff-9061cb201ad6" containerName="ceilometer-notification-agent" Oct 07 17:24:34 crc kubenswrapper[4681]: E1007 17:24:34.859729 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc5c9d4-2144-461c-b1ff-9061cb201ad6" containerName="sg-core" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.859735 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc5c9d4-2144-461c-b1ff-9061cb201ad6" containerName="sg-core" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.859921 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfc5c9d4-2144-461c-b1ff-9061cb201ad6" containerName="ceilometer-central-agent" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.859934 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfc5c9d4-2144-461c-b1ff-9061cb201ad6" containerName="proxy-httpd" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.859946 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfc5c9d4-2144-461c-b1ff-9061cb201ad6" containerName="ceilometer-notification-agent" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.859955 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfc5c9d4-2144-461c-b1ff-9061cb201ad6" containerName="sg-core" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.861527 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.864822 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.865167 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.865832 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 07 17:24:34 crc kubenswrapper[4681]: I1007 17:24:34.887648 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.038711 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " pod="openstack/ceilometer-0" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.038895 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " pod="openstack/ceilometer-0" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.038965 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " pod="openstack/ceilometer-0" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.039017 4681 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m25xm\" (UniqueName: \"kubernetes.io/projected/8039090b-be20-41f5-8135-afb87372db43-kube-api-access-m25xm\") pod \"ceilometer-0\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " pod="openstack/ceilometer-0" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.039183 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-scripts\") pod \"ceilometer-0\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " pod="openstack/ceilometer-0" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.039251 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8039090b-be20-41f5-8135-afb87372db43-log-httpd\") pod \"ceilometer-0\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " pod="openstack/ceilometer-0" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.039340 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8039090b-be20-41f5-8135-afb87372db43-run-httpd\") pod \"ceilometer-0\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " pod="openstack/ceilometer-0" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.039363 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-config-data\") pod \"ceilometer-0\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " pod="openstack/ceilometer-0" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.040105 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfc5c9d4-2144-461c-b1ff-9061cb201ad6" path="/var/lib/kubelet/pods/cfc5c9d4-2144-461c-b1ff-9061cb201ad6/volumes" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.140759 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-scripts\") pod \"ceilometer-0\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " pod="openstack/ceilometer-0" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.140818 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8039090b-be20-41f5-8135-afb87372db43-log-httpd\") pod \"ceilometer-0\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " pod="openstack/ceilometer-0" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.140895 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8039090b-be20-41f5-8135-afb87372db43-run-httpd\") pod \"ceilometer-0\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " pod="openstack/ceilometer-0" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.140915 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-config-data\") pod \"ceilometer-0\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " pod="openstack/ceilometer-0" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.140967 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " pod="openstack/ceilometer-0" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.141022 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " pod="openstack/ceilometer-0" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.141047 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " pod="openstack/ceilometer-0" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.141069 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m25xm\" (UniqueName: \"kubernetes.io/projected/8039090b-be20-41f5-8135-afb87372db43-kube-api-access-m25xm\") pod \"ceilometer-0\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " pod="openstack/ceilometer-0" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.142830 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8039090b-be20-41f5-8135-afb87372db43-log-httpd\") pod \"ceilometer-0\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " pod="openstack/ceilometer-0" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.143191 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8039090b-be20-41f5-8135-afb87372db43-run-httpd\") pod \"ceilometer-0\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " pod="openstack/ceilometer-0" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.153360 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-config-data\") pod \"ceilometer-0\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " pod="openstack/ceilometer-0" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.156397 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-scripts\") pod \"ceilometer-0\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " pod="openstack/ceilometer-0" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.156777 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " pod="openstack/ceilometer-0" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.156821 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " pod="openstack/ceilometer-0" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.160329 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " pod="openstack/ceilometer-0" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.160525 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m25xm\" (UniqueName: \"kubernetes.io/projected/8039090b-be20-41f5-8135-afb87372db43-kube-api-access-m25xm\") pod \"ceilometer-0\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " pod="openstack/ceilometer-0" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.234389 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.506976 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ef096ee9-933c-44da-a4b7-6cc5b62ecc49","Type":"ContainerStarted","Data":"a8fe7ba3dfe27964eddcfb7679f15480f5060905338a0d0d404056987f1d104c"} Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.508442 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.542368 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.542351612 podStartE2EDuration="2.542351612s" podCreationTimestamp="2025-10-07 17:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:24:35.536043077 +0000 UTC m=+1279.183454632" watchObservedRunningTime="2025-10-07 17:24:35.542351612 +0000 UTC m=+1279.189763167" Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.736069 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 17:24:35 crc kubenswrapper[4681]: W1007 17:24:35.738901 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8039090b_be20_41f5_8135_afb87372db43.slice/crio-024e58a2c594e4f915c999192610867c46812b0711878ca70803bc684b2f717b WatchSource:0}: Error finding container 024e58a2c594e4f915c999192610867c46812b0711878ca70803bc684b2f717b: Status 404 returned error can't find the container with id 024e58a2c594e4f915c999192610867c46812b0711878ca70803bc684b2f717b Oct 07 17:24:35 crc kubenswrapper[4681]: I1007 17:24:35.741622 4681 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 17:24:36 crc kubenswrapper[4681]: I1007 17:24:36.526896 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8039090b-be20-41f5-8135-afb87372db43","Type":"ContainerStarted","Data":"024e58a2c594e4f915c999192610867c46812b0711878ca70803bc684b2f717b"} Oct 07 17:24:37 crc kubenswrapper[4681]: I1007 17:24:37.538627 4681 generic.go:334] "Generic (PLEG): container finished" podID="990e1913-44d7-414b-a116-6b712547fc81" containerID="af63601f836949946b81ec10e42eb0edfd94800d61baa6f37919799bbd67f8db" exitCode=137 Oct 07 17:24:37 crc kubenswrapper[4681]: I1007 17:24:37.538769 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64677bd694-6xgb2" event={"ID":"990e1913-44d7-414b-a116-6b712547fc81","Type":"ContainerDied","Data":"af63601f836949946b81ec10e42eb0edfd94800d61baa6f37919799bbd67f8db"} Oct 07 17:24:37 crc kubenswrapper[4681]: I1007 17:24:37.539106 4681 
scope.go:117] "RemoveContainer" containerID="a4aad55b86a935fdd7570b3d62dd77646b0917b7fe0adb2434009ebb8ecfb75b" Oct 07 17:24:37 crc kubenswrapper[4681]: I1007 17:24:37.541390 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8039090b-be20-41f5-8135-afb87372db43","Type":"ContainerStarted","Data":"6b6b5ef3678bcf8d7f017217dfee2b64e474d1bde1924a6708487049c4700264"} Oct 07 17:24:37 crc kubenswrapper[4681]: I1007 17:24:37.541427 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8039090b-be20-41f5-8135-afb87372db43","Type":"ContainerStarted","Data":"3d8909ae6f5acd852247904740cf548c36b8fd8ebb063450909fe225f6e0a307"} Oct 07 17:24:38 crc kubenswrapper[4681]: I1007 17:24:38.552832 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64677bd694-6xgb2" event={"ID":"990e1913-44d7-414b-a116-6b712547fc81","Type":"ContainerStarted","Data":"b4889e462f03c208394a02d8c27c149d4669b02ad5367737278bbdc6137dfbb3"} Oct 07 17:24:38 crc kubenswrapper[4681]: I1007 17:24:38.556310 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8039090b-be20-41f5-8135-afb87372db43","Type":"ContainerStarted","Data":"60fff8663367fb62de4a85a1c0f9318700d74e2d56b75e39723c84de5fd85996"} Oct 07 17:24:38 crc kubenswrapper[4681]: I1007 17:24:38.559497 4681 generic.go:334] "Generic (PLEG): container finished" podID="02a91326-9285-4589-a05b-c0a2c2ed397e" containerID="a2ef2c60f997a9c728de5a3cb38dc728740b1786f9bd5808e689dbe3f49f3013" exitCode=137 Oct 07 17:24:38 crc kubenswrapper[4681]: I1007 17:24:38.559540 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f945f854d-hm49c" event={"ID":"02a91326-9285-4589-a05b-c0a2c2ed397e","Type":"ContainerDied","Data":"a2ef2c60f997a9c728de5a3cb38dc728740b1786f9bd5808e689dbe3f49f3013"} Oct 07 17:24:38 crc kubenswrapper[4681]: I1007 17:24:38.559574 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f945f854d-hm49c" event={"ID":"02a91326-9285-4589-a05b-c0a2c2ed397e","Type":"ContainerStarted","Data":"845dfb76beed4fae7f06e1965212c9f12fa5128147df5d7e4866d40964ffb7d9"} Oct 07 17:24:38 crc kubenswrapper[4681]: I1007 17:24:38.559602 4681 scope.go:117] "RemoveContainer" containerID="9084625f4c93f3307d3d2fa500d4105766d6a26c88fba8323a56f7e6882db8ed" Oct 07 17:24:39 crc kubenswrapper[4681]: I1007 17:24:39.578533 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8039090b-be20-41f5-8135-afb87372db43","Type":"ContainerStarted","Data":"3227ac7d985aef34f42c90f5f24560179833ff14323eaabc6a78792879c0e455"} Oct 07 17:24:39 crc kubenswrapper[4681]: I1007 17:24:39.579817 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 17:24:39 crc kubenswrapper[4681]: I1007 17:24:39.620358 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.252022004 podStartE2EDuration="5.620341093s" podCreationTimestamp="2025-10-07 17:24:34 +0000 UTC" firstStartedPulling="2025-10-07 17:24:35.741154101 +0000 UTC m=+1279.388565656" lastFinishedPulling="2025-10-07 17:24:39.10947319 +0000 UTC m=+1282.756884745" observedRunningTime="2025-10-07 17:24:39.612597257 +0000 UTC m=+1283.260008812" watchObservedRunningTime="2025-10-07 17:24:39.620341093 +0000 UTC m=+1283.267752648" Oct 07 17:24:41 crc kubenswrapper[4681]: I1007 17:24:41.153296 4681 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 07 17:24:42 crc kubenswrapper[4681]: I1007 17:24:42.194725 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 17:24:42 crc kubenswrapper[4681]: I1007 17:24:42.195044 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 17:24:43 crc kubenswrapper[4681]: I1007 17:24:43.898095 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.378985 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-kzhqh"] Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.380131 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kzhqh" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.386550 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.390900 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.392839 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-kzhqh"] Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.512295 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2rnl\" (UniqueName: \"kubernetes.io/projected/42d37201-5b36-4972-a378-7e20139e4731-kube-api-access-z2rnl\") pod \"nova-cell0-cell-mapping-kzhqh\" (UID: \"42d37201-5b36-4972-a378-7e20139e4731\") " pod="openstack/nova-cell0-cell-mapping-kzhqh" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.512476 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42d37201-5b36-4972-a378-7e20139e4731-config-data\") pod \"nova-cell0-cell-mapping-kzhqh\" (UID: \"42d37201-5b36-4972-a378-7e20139e4731\") " pod="openstack/nova-cell0-cell-mapping-kzhqh" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.512568 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d37201-5b36-4972-a378-7e20139e4731-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kzhqh\" (UID: \"42d37201-5b36-4972-a378-7e20139e4731\") " pod="openstack/nova-cell0-cell-mapping-kzhqh" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.512599 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42d37201-5b36-4972-a378-7e20139e4731-scripts\") pod \"nova-cell0-cell-mapping-kzhqh\" (UID: \"42d37201-5b36-4972-a378-7e20139e4731\") " pod="openstack/nova-cell0-cell-mapping-kzhqh" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.572864 4681 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.574376 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.582443 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.591000 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.615358 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42d37201-5b36-4972-a378-7e20139e4731-config-data\") pod \"nova-cell0-cell-mapping-kzhqh\" (UID: \"42d37201-5b36-4972-a378-7e20139e4731\") " pod="openstack/nova-cell0-cell-mapping-kzhqh" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.615447 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d37201-5b36-4972-a378-7e20139e4731-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kzhqh\" (UID: \"42d37201-5b36-4972-a378-7e20139e4731\") " pod="openstack/nova-cell0-cell-mapping-kzhqh" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.615474 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42d37201-5b36-4972-a378-7e20139e4731-scripts\") pod \"nova-cell0-cell-mapping-kzhqh\" (UID: \"42d37201-5b36-4972-a378-7e20139e4731\") " pod="openstack/nova-cell0-cell-mapping-kzhqh" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.615531 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2rnl\" (UniqueName: \"kubernetes.io/projected/42d37201-5b36-4972-a378-7e20139e4731-kube-api-access-z2rnl\") pod \"nova-cell0-cell-mapping-kzhqh\" (UID: \"42d37201-5b36-4972-a378-7e20139e4731\") " pod="openstack/nova-cell0-cell-mapping-kzhqh" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.622336 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d37201-5b36-4972-a378-7e20139e4731-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kzhqh\" (UID: \"42d37201-5b36-4972-a378-7e20139e4731\") " pod="openstack/nova-cell0-cell-mapping-kzhqh" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.625793 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42d37201-5b36-4972-a378-7e20139e4731-scripts\") pod \"nova-cell0-cell-mapping-kzhqh\" (UID: \"42d37201-5b36-4972-a378-7e20139e4731\") " pod="openstack/nova-cell0-cell-mapping-kzhqh" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.658679 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42d37201-5b36-4972-a378-7e20139e4731-config-data\") pod \"nova-cell0-cell-mapping-kzhqh\" (UID: \"42d37201-5b36-4972-a378-7e20139e4731\") " pod="openstack/nova-cell0-cell-mapping-kzhqh" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.668671 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2rnl\" (UniqueName: \"kubernetes.io/projected/42d37201-5b36-4972-a378-7e20139e4731-kube-api-access-z2rnl\") pod \"nova-cell0-cell-mapping-kzhqh\" (UID: 
\"42d37201-5b36-4972-a378-7e20139e4731\") " pod="openstack/nova-cell0-cell-mapping-kzhqh" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.705497 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kzhqh" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.717685 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cksb\" (UniqueName: \"kubernetes.io/projected/a7996c51-2302-4c44-8b48-c2380e0bbd00-kube-api-access-6cksb\") pod \"nova-api-0\" (UID: \"a7996c51-2302-4c44-8b48-c2380e0bbd00\") " pod="openstack/nova-api-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.717733 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7996c51-2302-4c44-8b48-c2380e0bbd00-config-data\") pod \"nova-api-0\" (UID: \"a7996c51-2302-4c44-8b48-c2380e0bbd00\") " pod="openstack/nova-api-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.717757 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7996c51-2302-4c44-8b48-c2380e0bbd00-logs\") pod \"nova-api-0\" (UID: \"a7996c51-2302-4c44-8b48-c2380e0bbd00\") " pod="openstack/nova-api-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.717852 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7996c51-2302-4c44-8b48-c2380e0bbd00-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a7996c51-2302-4c44-8b48-c2380e0bbd00\") " pod="openstack/nova-api-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.740394 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.741807 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.764183 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.765600 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.823034 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cksb\" (UniqueName: \"kubernetes.io/projected/a7996c51-2302-4c44-8b48-c2380e0bbd00-kube-api-access-6cksb\") pod \"nova-api-0\" (UID: \"a7996c51-2302-4c44-8b48-c2380e0bbd00\") " pod="openstack/nova-api-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.823096 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7996c51-2302-4c44-8b48-c2380e0bbd00-config-data\") pod \"nova-api-0\" (UID: \"a7996c51-2302-4c44-8b48-c2380e0bbd00\") " pod="openstack/nova-api-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.823126 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7996c51-2302-4c44-8b48-c2380e0bbd00-logs\") pod \"nova-api-0\" (UID: \"a7996c51-2302-4c44-8b48-c2380e0bbd00\") " pod="openstack/nova-api-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.823200 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfc39dfe-0be6-4936-94d8-17c9c96759c4-logs\") pod \"nova-metadata-0\" (UID: \"cfc39dfe-0be6-4936-94d8-17c9c96759c4\") " pod="openstack/nova-metadata-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.823225 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfc39dfe-0be6-4936-94d8-17c9c96759c4-config-data\") pod \"nova-metadata-0\" (UID: \"cfc39dfe-0be6-4936-94d8-17c9c96759c4\") " pod="openstack/nova-metadata-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.823309 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7996c51-2302-4c44-8b48-c2380e0bbd00-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a7996c51-2302-4c44-8b48-c2380e0bbd00\") " pod="openstack/nova-api-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.823351 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwtft\" (UniqueName: \"kubernetes.io/projected/cfc39dfe-0be6-4936-94d8-17c9c96759c4-kube-api-access-wwtft\") pod \"nova-metadata-0\" (UID: \"cfc39dfe-0be6-4936-94d8-17c9c96759c4\") " pod="openstack/nova-metadata-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.823387 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfc39dfe-0be6-4936-94d8-17c9c96759c4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cfc39dfe-0be6-4936-94d8-17c9c96759c4\") " pod="openstack/nova-metadata-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.824707 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7996c51-2302-4c44-8b48-c2380e0bbd00-logs\") pod \"nova-api-0\" (UID: 
\"a7996c51-2302-4c44-8b48-c2380e0bbd00\") " pod="openstack/nova-api-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.843782 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7996c51-2302-4c44-8b48-c2380e0bbd00-config-data\") pod \"nova-api-0\" (UID: \"a7996c51-2302-4c44-8b48-c2380e0bbd00\") " pod="openstack/nova-api-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.845951 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7996c51-2302-4c44-8b48-c2380e0bbd00-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a7996c51-2302-4c44-8b48-c2380e0bbd00\") " pod="openstack/nova-api-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.870815 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cksb\" (UniqueName: \"kubernetes.io/projected/a7996c51-2302-4c44-8b48-c2380e0bbd00-kube-api-access-6cksb\") pod \"nova-api-0\" (UID: \"a7996c51-2302-4c44-8b48-c2380e0bbd00\") " pod="openstack/nova-api-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.875266 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.876419 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.880999 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.895174 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.895796 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.906286 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-mcftl"] Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.908107 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-mcftl" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.924970 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfc39dfe-0be6-4936-94d8-17c9c96759c4-logs\") pod \"nova-metadata-0\" (UID: \"cfc39dfe-0be6-4936-94d8-17c9c96759c4\") " pod="openstack/nova-metadata-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.925020 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfc39dfe-0be6-4936-94d8-17c9c96759c4-config-data\") pod \"nova-metadata-0\" (UID: \"cfc39dfe-0be6-4936-94d8-17c9c96759c4\") " pod="openstack/nova-metadata-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.925093 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwtft\" (UniqueName: \"kubernetes.io/projected/cfc39dfe-0be6-4936-94d8-17c9c96759c4-kube-api-access-wwtft\") pod \"nova-metadata-0\" (UID: \"cfc39dfe-0be6-4936-94d8-17c9c96759c4\") " pod="openstack/nova-metadata-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.925122 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfc39dfe-0be6-4936-94d8-17c9c96759c4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cfc39dfe-0be6-4936-94d8-17c9c96759c4\") " pod="openstack/nova-metadata-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.929339 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfc39dfe-0be6-4936-94d8-17c9c96759c4-logs\") pod \"nova-metadata-0\" (UID: \"cfc39dfe-0be6-4936-94d8-17c9c96759c4\") " pod="openstack/nova-metadata-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.932710 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfc39dfe-0be6-4936-94d8-17c9c96759c4-config-data\") pod \"nova-metadata-0\" (UID: \"cfc39dfe-0be6-4936-94d8-17c9c96759c4\") " pod="openstack/nova-metadata-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.971673 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfc39dfe-0be6-4936-94d8-17c9c96759c4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cfc39dfe-0be6-4936-94d8-17c9c96759c4\") " pod="openstack/nova-metadata-0" Oct 07 17:24:44 crc kubenswrapper[4681]: I1007 17:24:44.997661 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-mcftl"] Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.002977 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwtft\" (UniqueName: \"kubernetes.io/projected/cfc39dfe-0be6-4936-94d8-17c9c96759c4-kube-api-access-wwtft\") pod \"nova-metadata-0\" (UID: \"cfc39dfe-0be6-4936-94d8-17c9c96759c4\") " pod="openstack/nova-metadata-0" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.031993 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84994a7f-c636-4c7d-9b95-6f3b6eda305e-config-data\") pod \"nova-scheduler-0\" (UID: \"84994a7f-c636-4c7d-9b95-6f3b6eda305e\") " pod="openstack/nova-scheduler-0" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.032072 4681 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw72j\" (UniqueName: \"kubernetes.io/projected/84994a7f-c636-4c7d-9b95-6f3b6eda305e-kube-api-access-bw72j\") pod \"nova-scheduler-0\" (UID: \"84994a7f-c636-4c7d-9b95-6f3b6eda305e\") " pod="openstack/nova-scheduler-0" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.032130 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-mcftl\" (UID: \"13cc700b-7284-4106-aa2c-3d83ef58b00a\") " pod="openstack/dnsmasq-dns-bccf8f775-mcftl" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.032171 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84994a7f-c636-4c7d-9b95-6f3b6eda305e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"84994a7f-c636-4c7d-9b95-6f3b6eda305e\") " pod="openstack/nova-scheduler-0" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.032216 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-mcftl\" (UID: \"13cc700b-7284-4106-aa2c-3d83ef58b00a\") " pod="openstack/dnsmasq-dns-bccf8f775-mcftl" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.032245 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfp9h\" (UniqueName: \"kubernetes.io/projected/13cc700b-7284-4106-aa2c-3d83ef58b00a-kube-api-access-lfp9h\") pod \"dnsmasq-dns-bccf8f775-mcftl\" (UID: \"13cc700b-7284-4106-aa2c-3d83ef58b00a\") " pod="openstack/dnsmasq-dns-bccf8f775-mcftl" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.032285 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-mcftl\" (UID: \"13cc700b-7284-4106-aa2c-3d83ef58b00a\") " pod="openstack/dnsmasq-dns-bccf8f775-mcftl" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.032313 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-config\") pod \"dnsmasq-dns-bccf8f775-mcftl\" (UID: \"13cc700b-7284-4106-aa2c-3d83ef58b00a\") " pod="openstack/dnsmasq-dns-bccf8f775-mcftl" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.032357 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-dns-svc\") pod \"dnsmasq-dns-bccf8f775-mcftl\" (UID: \"13cc700b-7284-4106-aa2c-3d83ef58b00a\") " pod="openstack/dnsmasq-dns-bccf8f775-mcftl" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.055774 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.057184 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.060295 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.061497 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.134691 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84994a7f-c636-4c7d-9b95-6f3b6eda305e-config-data\") pod \"nova-scheduler-0\" (UID: \"84994a7f-c636-4c7d-9b95-6f3b6eda305e\") " pod="openstack/nova-scheduler-0" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.134778 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39ccace7-9bc9-426e-a9df-a5d58dbe5aa1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"39ccace7-9bc9-426e-a9df-a5d58dbe5aa1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.135166 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw72j\" (UniqueName: \"kubernetes.io/projected/84994a7f-c636-4c7d-9b95-6f3b6eda305e-kube-api-access-bw72j\") pod \"nova-scheduler-0\" (UID: \"84994a7f-c636-4c7d-9b95-6f3b6eda305e\") " pod="openstack/nova-scheduler-0" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.135202 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39ccace7-9bc9-426e-a9df-a5d58dbe5aa1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"39ccace7-9bc9-426e-a9df-a5d58dbe5aa1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.135269 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-mcftl\" (UID: \"13cc700b-7284-4106-aa2c-3d83ef58b00a\") " pod="openstack/dnsmasq-dns-bccf8f775-mcftl" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.135316 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84994a7f-c636-4c7d-9b95-6f3b6eda305e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"84994a7f-c636-4c7d-9b95-6f3b6eda305e\") " pod="openstack/nova-scheduler-0" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.135359 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-mcftl\" (UID: \"13cc700b-7284-4106-aa2c-3d83ef58b00a\") " pod="openstack/dnsmasq-dns-bccf8f775-mcftl" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.135399 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfp9h\" (UniqueName: \"kubernetes.io/projected/13cc700b-7284-4106-aa2c-3d83ef58b00a-kube-api-access-lfp9h\") pod \"dnsmasq-dns-bccf8f775-mcftl\" (UID: \"13cc700b-7284-4106-aa2c-3d83ef58b00a\") " pod="openstack/dnsmasq-dns-bccf8f775-mcftl" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.135432 4681 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-mcftl\" (UID: \"13cc700b-7284-4106-aa2c-3d83ef58b00a\") " pod="openstack/dnsmasq-dns-bccf8f775-mcftl" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.135451 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-config\") pod \"dnsmasq-dns-bccf8f775-mcftl\" (UID: \"13cc700b-7284-4106-aa2c-3d83ef58b00a\") " pod="openstack/dnsmasq-dns-bccf8f775-mcftl" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.135512 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-dns-svc\") pod \"dnsmasq-dns-bccf8f775-mcftl\" (UID: \"13cc700b-7284-4106-aa2c-3d83ef58b00a\") " pod="openstack/dnsmasq-dns-bccf8f775-mcftl" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.135564 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9r8g\" (UniqueName: \"kubernetes.io/projected/39ccace7-9bc9-426e-a9df-a5d58dbe5aa1-kube-api-access-t9r8g\") pod \"nova-cell1-novncproxy-0\" (UID: \"39ccace7-9bc9-426e-a9df-a5d58dbe5aa1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.136821 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-mcftl\" (UID: \"13cc700b-7284-4106-aa2c-3d83ef58b00a\") " pod="openstack/dnsmasq-dns-bccf8f775-mcftl" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.136980 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-config\") pod \"dnsmasq-dns-bccf8f775-mcftl\" (UID: \"13cc700b-7284-4106-aa2c-3d83ef58b00a\") " pod="openstack/dnsmasq-dns-bccf8f775-mcftl" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.137077 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-mcftl\" (UID: \"13cc700b-7284-4106-aa2c-3d83ef58b00a\") " pod="openstack/dnsmasq-dns-bccf8f775-mcftl" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.137199 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-dns-svc\") pod \"dnsmasq-dns-bccf8f775-mcftl\" (UID: \"13cc700b-7284-4106-aa2c-3d83ef58b00a\") " pod="openstack/dnsmasq-dns-bccf8f775-mcftl" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.137612 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-mcftl\" (UID: \"13cc700b-7284-4106-aa2c-3d83ef58b00a\") " pod="openstack/dnsmasq-dns-bccf8f775-mcftl" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.141985 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/84994a7f-c636-4c7d-9b95-6f3b6eda305e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"84994a7f-c636-4c7d-9b95-6f3b6eda305e\") " pod="openstack/nova-scheduler-0" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.164935 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw72j\" (UniqueName: \"kubernetes.io/projected/84994a7f-c636-4c7d-9b95-6f3b6eda305e-kube-api-access-bw72j\") pod \"nova-scheduler-0\" (UID: \"84994a7f-c636-4c7d-9b95-6f3b6eda305e\") " pod="openstack/nova-scheduler-0" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.165825 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84994a7f-c636-4c7d-9b95-6f3b6eda305e-config-data\") pod \"nova-scheduler-0\" (UID: \"84994a7f-c636-4c7d-9b95-6f3b6eda305e\") " pod="openstack/nova-scheduler-0" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.169061 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfp9h\" (UniqueName: \"kubernetes.io/projected/13cc700b-7284-4106-aa2c-3d83ef58b00a-kube-api-access-lfp9h\") pod \"dnsmasq-dns-bccf8f775-mcftl\" (UID: \"13cc700b-7284-4106-aa2c-3d83ef58b00a\") " pod="openstack/dnsmasq-dns-bccf8f775-mcftl" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.217860 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.238067 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9r8g\" (UniqueName: \"kubernetes.io/projected/39ccace7-9bc9-426e-a9df-a5d58dbe5aa1-kube-api-access-t9r8g\") pod \"nova-cell1-novncproxy-0\" (UID: \"39ccace7-9bc9-426e-a9df-a5d58dbe5aa1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.238126 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39ccace7-9bc9-426e-a9df-a5d58dbe5aa1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"39ccace7-9bc9-426e-a9df-a5d58dbe5aa1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.238171 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39ccace7-9bc9-426e-a9df-a5d58dbe5aa1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"39ccace7-9bc9-426e-a9df-a5d58dbe5aa1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.238546 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.246418 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39ccace7-9bc9-426e-a9df-a5d58dbe5aa1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"39ccace7-9bc9-426e-a9df-a5d58dbe5aa1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.246851 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39ccace7-9bc9-426e-a9df-a5d58dbe5aa1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"39ccace7-9bc9-426e-a9df-a5d58dbe5aa1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.261302 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9r8g\" (UniqueName: \"kubernetes.io/projected/39ccace7-9bc9-426e-a9df-a5d58dbe5aa1-kube-api-access-t9r8g\") pod \"nova-cell1-novncproxy-0\" (UID: \"39ccace7-9bc9-426e-a9df-a5d58dbe5aa1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.283304 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-mcftl" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.383285 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.495463 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.607579 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-kzhqh"] Oct 07 17:24:45 crc kubenswrapper[4681]: W1007 17:24:45.626812 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42d37201_5b36_4972_a378_7e20139e4731.slice/crio-d22df61524673c9b9139efbc0a6cc42b4f7e47252f32d7e9eef5910f34f8c66a WatchSource:0}: Error finding container d22df61524673c9b9139efbc0a6cc42b4f7e47252f32d7e9eef5910f34f8c66a: Status 404 returned error can't find the container with id d22df61524673c9b9139efbc0a6cc42b4f7e47252f32d7e9eef5910f34f8c66a Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.671110 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a7996c51-2302-4c44-8b48-c2380e0bbd00","Type":"ContainerStarted","Data":"bb914a2a6c9b0d3203c68c186b5fe7f7d731fae0b5f3248992cb244c1a1776d6"} Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.671993 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kzhqh" event={"ID":"42d37201-5b36-4972-a378-7e20139e4731","Type":"ContainerStarted","Data":"d22df61524673c9b9139efbc0a6cc42b4f7e47252f32d7e9eef5910f34f8c66a"} Oct 07 17:24:45 crc kubenswrapper[4681]: I1007 17:24:45.867330 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 17:24:46 crc kubenswrapper[4681]: I1007 17:24:46.139816 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 17:24:46 crc kubenswrapper[4681]: I1007 17:24:46.186499 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-mcftl"] Oct 07 17:24:46 crc kubenswrapper[4681]: W1007 17:24:46.221331 4681 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13cc700b_7284_4106_aa2c_3d83ef58b00a.slice/crio-823764cd564594629f8b01391ac91eb972bec2648f1fad1ed6ed91aa682732e9 WatchSource:0}: Error finding container 823764cd564594629f8b01391ac91eb972bec2648f1fad1ed6ed91aa682732e9: Status 404 returned error can't find the container with id 823764cd564594629f8b01391ac91eb972bec2648f1fad1ed6ed91aa682732e9 Oct 07 17:24:46 crc kubenswrapper[4681]: I1007 17:24:46.442356 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 17:24:46 crc kubenswrapper[4681]: W1007 17:24:46.447745 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39ccace7_9bc9_426e_a9df_a5d58dbe5aa1.slice/crio-28cdff51c4d5911478ff12d6ba6ec0ece83fa0057d976b3b76ac0192b911a89e WatchSource:0}: Error finding container 28cdff51c4d5911478ff12d6ba6ec0ece83fa0057d976b3b76ac0192b911a89e: Status 404 returned error can't find the container with id 28cdff51c4d5911478ff12d6ba6ec0ece83fa0057d976b3b76ac0192b911a89e Oct 07 17:24:46 crc kubenswrapper[4681]: I1007 17:24:46.704500 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-mcftl" event={"ID":"13cc700b-7284-4106-aa2c-3d83ef58b00a","Type":"ContainerStarted","Data":"d240bf962a7b81b1551dfbe97931d983e97e348485ca8ce87aa45e1eb3589f9f"} Oct 07 17:24:46 crc kubenswrapper[4681]: I1007 17:24:46.704806 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-mcftl" event={"ID":"13cc700b-7284-4106-aa2c-3d83ef58b00a","Type":"ContainerStarted","Data":"823764cd564594629f8b01391ac91eb972bec2648f1fad1ed6ed91aa682732e9"} Oct 07 17:24:46 crc kubenswrapper[4681]: I1007 17:24:46.713915 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"84994a7f-c636-4c7d-9b95-6f3b6eda305e","Type":"ContainerStarted","Data":"5a9c971d60efb6b558e005de291d143a3f671dcafa8433a5e6d4f16e05c206ef"} Oct 07 17:24:46 crc kubenswrapper[4681]: I1007 17:24:46.721403 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"39ccace7-9bc9-426e-a9df-a5d58dbe5aa1","Type":"ContainerStarted","Data":"28cdff51c4d5911478ff12d6ba6ec0ece83fa0057d976b3b76ac0192b911a89e"} Oct 07 17:24:46 crc kubenswrapper[4681]: I1007 17:24:46.730060 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cfc39dfe-0be6-4936-94d8-17c9c96759c4","Type":"ContainerStarted","Data":"00c6722fb852df422417a1f190941fb4113cd5e559876bb8c2172dddfb37aaf6"} Oct 07 17:24:46 crc kubenswrapper[4681]: I1007 17:24:46.737753 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kzhqh" event={"ID":"42d37201-5b36-4972-a378-7e20139e4731","Type":"ContainerStarted","Data":"fd4a6b3e938118ec75880ea0e489dff7d2fe5084b9dd35cd3a002f7db3f74d1c"} Oct 07 17:24:46 crc kubenswrapper[4681]: I1007 17:24:46.766782 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-kzhqh" podStartSLOduration=2.7667594170000003 podStartE2EDuration="2.766759417s" podCreationTimestamp="2025-10-07 17:24:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:24:46.762329924 +0000 UTC m=+1290.409741469" 
watchObservedRunningTime="2025-10-07 17:24:46.766759417 +0000 UTC m=+1290.414171132" Oct 07 17:24:46 crc kubenswrapper[4681]: I1007 17:24:46.862269 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bw9sz"] Oct 07 17:24:46 crc kubenswrapper[4681]: I1007 17:24:46.864119 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bw9sz" Oct 07 17:24:46 crc kubenswrapper[4681]: I1007 17:24:46.867566 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 07 17:24:46 crc kubenswrapper[4681]: I1007 17:24:46.867901 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 07 17:24:46 crc kubenswrapper[4681]: I1007 17:24:46.900306 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bw9sz"] Oct 07 17:24:46 crc kubenswrapper[4681]: I1007 17:24:46.958986 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd85044-6828-4eb3-89df-b7efcd333c6a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bw9sz\" (UID: \"bbd85044-6828-4eb3-89df-b7efcd333c6a\") " pod="openstack/nova-cell1-conductor-db-sync-bw9sz" Oct 07 17:24:46 crc kubenswrapper[4681]: I1007 17:24:46.959131 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nddkf\" (UniqueName: \"kubernetes.io/projected/bbd85044-6828-4eb3-89df-b7efcd333c6a-kube-api-access-nddkf\") pod \"nova-cell1-conductor-db-sync-bw9sz\" (UID: \"bbd85044-6828-4eb3-89df-b7efcd333c6a\") " pod="openstack/nova-cell1-conductor-db-sync-bw9sz" Oct 07 17:24:46 crc kubenswrapper[4681]: I1007 17:24:46.959425 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd85044-6828-4eb3-89df-b7efcd333c6a-config-data\") pod \"nova-cell1-conductor-db-sync-bw9sz\" (UID: \"bbd85044-6828-4eb3-89df-b7efcd333c6a\") " pod="openstack/nova-cell1-conductor-db-sync-bw9sz" Oct 07 17:24:46 crc kubenswrapper[4681]: I1007 17:24:46.959469 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbd85044-6828-4eb3-89df-b7efcd333c6a-scripts\") pod \"nova-cell1-conductor-db-sync-bw9sz\" (UID: \"bbd85044-6828-4eb3-89df-b7efcd333c6a\") " pod="openstack/nova-cell1-conductor-db-sync-bw9sz" Oct 07 17:24:47 crc kubenswrapper[4681]: I1007 17:24:47.062337 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd85044-6828-4eb3-89df-b7efcd333c6a-config-data\") pod \"nova-cell1-conductor-db-sync-bw9sz\" (UID: \"bbd85044-6828-4eb3-89df-b7efcd333c6a\") " pod="openstack/nova-cell1-conductor-db-sync-bw9sz" Oct 07 17:24:47 crc kubenswrapper[4681]: I1007 17:24:47.070560 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbd85044-6828-4eb3-89df-b7efcd333c6a-scripts\") pod \"nova-cell1-conductor-db-sync-bw9sz\" (UID: \"bbd85044-6828-4eb3-89df-b7efcd333c6a\") " pod="openstack/nova-cell1-conductor-db-sync-bw9sz" Oct 07 17:24:47 crc kubenswrapper[4681]: I1007 17:24:47.071308 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd85044-6828-4eb3-89df-b7efcd333c6a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bw9sz\" (UID: \"bbd85044-6828-4eb3-89df-b7efcd333c6a\") " pod="openstack/nova-cell1-conductor-db-sync-bw9sz" Oct 07 17:24:47 crc kubenswrapper[4681]: I1007 17:24:47.071597 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nddkf\" (UniqueName: \"kubernetes.io/projected/bbd85044-6828-4eb3-89df-b7efcd333c6a-kube-api-access-nddkf\") pod \"nova-cell1-conductor-db-sync-bw9sz\" (UID: \"bbd85044-6828-4eb3-89df-b7efcd333c6a\") " pod="openstack/nova-cell1-conductor-db-sync-bw9sz" Oct 07 17:24:47 crc kubenswrapper[4681]: I1007 17:24:47.079742 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd85044-6828-4eb3-89df-b7efcd333c6a-config-data\") pod \"nova-cell1-conductor-db-sync-bw9sz\" (UID: \"bbd85044-6828-4eb3-89df-b7efcd333c6a\") " pod="openstack/nova-cell1-conductor-db-sync-bw9sz" Oct 07 17:24:47 crc kubenswrapper[4681]: I1007 17:24:47.089129 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd85044-6828-4eb3-89df-b7efcd333c6a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bw9sz\" (UID: \"bbd85044-6828-4eb3-89df-b7efcd333c6a\") " pod="openstack/nova-cell1-conductor-db-sync-bw9sz" Oct 07 17:24:47 crc kubenswrapper[4681]: I1007 17:24:47.094552 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nddkf\" (UniqueName: \"kubernetes.io/projected/bbd85044-6828-4eb3-89df-b7efcd333c6a-kube-api-access-nddkf\") pod \"nova-cell1-conductor-db-sync-bw9sz\" (UID: \"bbd85044-6828-4eb3-89df-b7efcd333c6a\") " pod="openstack/nova-cell1-conductor-db-sync-bw9sz" Oct 07 17:24:47 crc kubenswrapper[4681]: I1007 17:24:47.103919 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbd85044-6828-4eb3-89df-b7efcd333c6a-scripts\") pod \"nova-cell1-conductor-db-sync-bw9sz\" (UID: \"bbd85044-6828-4eb3-89df-b7efcd333c6a\") " pod="openstack/nova-cell1-conductor-db-sync-bw9sz" Oct 07 17:24:47 crc kubenswrapper[4681]: I1007 17:24:47.195130 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bw9sz" Oct 07 17:24:47 crc kubenswrapper[4681]: I1007 17:24:47.441099 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:24:47 crc kubenswrapper[4681]: I1007 17:24:47.441562 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:24:47 crc kubenswrapper[4681]: I1007 17:24:47.443069 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-64677bd694-6xgb2" podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.142:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.142:8443: connect: connection refused" Oct 07 17:24:47 crc kubenswrapper[4681]: I1007 17:24:47.540617 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bw9sz"] Oct 07 17:24:47 crc kubenswrapper[4681]: W1007 17:24:47.564484 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbd85044_6828_4eb3_89df_b7efcd333c6a.slice/crio-fc39ff4f2aacf9819231870fcea5d2efc3d5f4edec48ef28cff40bcf7897b2a1 WatchSource:0}: Error finding container fc39ff4f2aacf9819231870fcea5d2efc3d5f4edec48ef28cff40bcf7897b2a1: Status 404 returned error can't find the container with id fc39ff4f2aacf9819231870fcea5d2efc3d5f4edec48ef28cff40bcf7897b2a1 Oct 07 17:24:47 crc kubenswrapper[4681]: I1007 17:24:47.618520 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:24:47 crc kubenswrapper[4681]: I1007 17:24:47.618796 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:24:47 crc kubenswrapper[4681]: I1007 17:24:47.619774 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-f945f854d-hm49c" podUID="02a91326-9285-4589-a05b-c0a2c2ed397e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Oct 07 17:24:47 crc kubenswrapper[4681]: I1007 17:24:47.777841 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bw9sz" event={"ID":"bbd85044-6828-4eb3-89df-b7efcd333c6a","Type":"ContainerStarted","Data":"fc39ff4f2aacf9819231870fcea5d2efc3d5f4edec48ef28cff40bcf7897b2a1"} Oct 07 17:24:47 crc kubenswrapper[4681]: I1007 17:24:47.783971 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-mcftl" event={"ID":"13cc700b-7284-4106-aa2c-3d83ef58b00a","Type":"ContainerDied","Data":"d240bf962a7b81b1551dfbe97931d983e97e348485ca8ce87aa45e1eb3589f9f"} Oct 07 17:24:47 crc kubenswrapper[4681]: I1007 17:24:47.783281 4681 generic.go:334] "Generic (PLEG): container finished" podID="13cc700b-7284-4106-aa2c-3d83ef58b00a" containerID="d240bf962a7b81b1551dfbe97931d983e97e348485ca8ce87aa45e1eb3589f9f" exitCode=0 Oct 07 17:24:48 crc kubenswrapper[4681]: I1007 17:24:48.871694 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bw9sz" event={"ID":"bbd85044-6828-4eb3-89df-b7efcd333c6a","Type":"ContainerStarted","Data":"473b56e8b1ae81160a230176445922e3c0bd2a0220f9cd450022e93dd99fb60b"} Oct 07 17:24:48 crc kubenswrapper[4681]: I1007 17:24:48.881934 4681 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-mcftl" event={"ID":"13cc700b-7284-4106-aa2c-3d83ef58b00a","Type":"ContainerStarted","Data":"20cef2b951410633979fbe0d3d90b743a2f268a7f89f7f02c5efcb529693f4a8"} Oct 07 17:24:48 crc kubenswrapper[4681]: I1007 17:24:48.882691 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-mcftl" Oct 07 17:24:48 crc kubenswrapper[4681]: I1007 17:24:48.900411 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 17:24:48 crc kubenswrapper[4681]: I1007 17:24:48.915725 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-bw9sz" podStartSLOduration=2.915697034 podStartE2EDuration="2.915697034s" podCreationTimestamp="2025-10-07 17:24:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:24:48.898712201 +0000 UTC m=+1292.546123746" watchObservedRunningTime="2025-10-07 17:24:48.915697034 +0000 UTC m=+1292.563108589" Oct 07 17:24:48 crc kubenswrapper[4681]: I1007 17:24:48.938523 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 17:24:48 crc kubenswrapper[4681]: I1007 17:24:48.951265 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bccf8f775-mcftl" podStartSLOduration=4.951246975 podStartE2EDuration="4.951246975s" podCreationTimestamp="2025-10-07 17:24:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:24:48.923010298 +0000 UTC m=+1292.570421853" watchObservedRunningTime="2025-10-07 17:24:48.951246975 +0000 UTC m=+1292.598658530" Oct 07 17:24:51 crc kubenswrapper[4681]: I1007 17:24:51.912175 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"84994a7f-c636-4c7d-9b95-6f3b6eda305e","Type":"ContainerStarted","Data":"b442b30e3cc98966758b856dd0793a9a6b84fedd3d3fd85c7df3d3b5c46969bb"} Oct 07 17:24:51 crc kubenswrapper[4681]: I1007 17:24:51.915339 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"39ccace7-9bc9-426e-a9df-a5d58dbe5aa1","Type":"ContainerStarted","Data":"c067c544e931d305de8934b70b4ff57feb8ba2522b13b6b6940e4a65fc80b711"} Oct 07 17:24:51 crc kubenswrapper[4681]: I1007 17:24:51.915399 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="39ccace7-9bc9-426e-a9df-a5d58dbe5aa1" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c067c544e931d305de8934b70b4ff57feb8ba2522b13b6b6940e4a65fc80b711" gracePeriod=30 Oct 07 17:24:51 crc kubenswrapper[4681]: I1007 17:24:51.920016 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a7996c51-2302-4c44-8b48-c2380e0bbd00","Type":"ContainerStarted","Data":"5cfe10fbc70bf03cd65ae986c6b40643100fd4dd469745ce4ce474d58591c3c2"} Oct 07 17:24:51 crc kubenswrapper[4681]: I1007 17:24:51.920052 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a7996c51-2302-4c44-8b48-c2380e0bbd00","Type":"ContainerStarted","Data":"6d38e462b329976d009a9862e1fa52dd6bc8773fe4f2f323a2878f30df2dc95e"} Oct 07 17:24:51 crc kubenswrapper[4681]: I1007 17:24:51.922494 4681 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cfc39dfe-0be6-4936-94d8-17c9c96759c4","Type":"ContainerStarted","Data":"d27b8ed7b4e9c4d33164b607a137a20bd47dba895611c27c087215a7194fc686"} Oct 07 17:24:51 crc kubenswrapper[4681]: I1007 17:24:51.922523 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cfc39dfe-0be6-4936-94d8-17c9c96759c4","Type":"ContainerStarted","Data":"51cd817e21af3a8a7a6d5988a8c922bcfbddf6cc04da6330c1b8e7cbc90f11ea"} Oct 07 17:24:51 crc kubenswrapper[4681]: I1007 17:24:51.922685 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cfc39dfe-0be6-4936-94d8-17c9c96759c4" containerName="nova-metadata-log" containerID="cri-o://51cd817e21af3a8a7a6d5988a8c922bcfbddf6cc04da6330c1b8e7cbc90f11ea" gracePeriod=30 Oct 07 17:24:51 crc kubenswrapper[4681]: I1007 17:24:51.922987 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cfc39dfe-0be6-4936-94d8-17c9c96759c4" containerName="nova-metadata-metadata" containerID="cri-o://d27b8ed7b4e9c4d33164b607a137a20bd47dba895611c27c087215a7194fc686" gracePeriod=30 Oct 07 17:24:51 crc kubenswrapper[4681]: I1007 17:24:51.931643 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.113509087 podStartE2EDuration="7.931624606s" podCreationTimestamp="2025-10-07 17:24:44 +0000 UTC" firstStartedPulling="2025-10-07 17:24:46.17606933 +0000 UTC m=+1289.823480885" lastFinishedPulling="2025-10-07 17:24:50.994184849 +0000 UTC m=+1294.641596404" observedRunningTime="2025-10-07 17:24:51.927346076 +0000 UTC m=+1295.574757631" watchObservedRunningTime="2025-10-07 17:24:51.931624606 +0000 UTC m=+1295.579036161" Oct 07 17:24:51 crc kubenswrapper[4681]: I1007 17:24:51.959031 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.495990014 podStartE2EDuration="7.9590167s" podCreationTimestamp="2025-10-07 17:24:44 +0000 UTC" firstStartedPulling="2025-10-07 17:24:45.533512309 +0000 UTC m=+1289.180923864" lastFinishedPulling="2025-10-07 17:24:50.996538995 +0000 UTC m=+1294.643950550" observedRunningTime="2025-10-07 17:24:51.954125043 +0000 UTC m=+1295.601536618" watchObservedRunningTime="2025-10-07 17:24:51.9590167 +0000 UTC m=+1295.606428255" Oct 07 17:24:51 crc kubenswrapper[4681]: I1007 17:24:51.992319 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.991198569 podStartE2EDuration="7.992278475s" podCreationTimestamp="2025-10-07 17:24:44 +0000 UTC" firstStartedPulling="2025-10-07 17:24:45.986991303 +0000 UTC m=+1289.634402858" lastFinishedPulling="2025-10-07 17:24:50.988071209 +0000 UTC m=+1294.635482764" observedRunningTime="2025-10-07 17:24:51.976242919 +0000 UTC m=+1295.623654474" watchObservedRunningTime="2025-10-07 17:24:51.992278475 +0000 UTC m=+1295.639690030" Oct 07 17:24:52 crc kubenswrapper[4681]: I1007 17:24:52.004165 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.461579784 podStartE2EDuration="8.004146407s" podCreationTimestamp="2025-10-07 17:24:44 +0000 UTC" firstStartedPulling="2025-10-07 17:24:46.450083324 +0000 UTC m=+1290.097494879" lastFinishedPulling="2025-10-07 17:24:50.992649947 +0000 UTC m=+1294.640061502" observedRunningTime="2025-10-07 17:24:51.997635325 
+0000 UTC m=+1295.645046870" watchObservedRunningTime="2025-10-07 17:24:52.004146407 +0000 UTC m=+1295.651557962" Oct 07 17:24:52 crc kubenswrapper[4681]: I1007 17:24:52.890718 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 17:24:52 crc kubenswrapper[4681]: I1007 17:24:52.935275 4681 generic.go:334] "Generic (PLEG): container finished" podID="cfc39dfe-0be6-4936-94d8-17c9c96759c4" containerID="d27b8ed7b4e9c4d33164b607a137a20bd47dba895611c27c087215a7194fc686" exitCode=0 Oct 07 17:24:52 crc kubenswrapper[4681]: I1007 17:24:52.935311 4681 generic.go:334] "Generic (PLEG): container finished" podID="cfc39dfe-0be6-4936-94d8-17c9c96759c4" containerID="51cd817e21af3a8a7a6d5988a8c922bcfbddf6cc04da6330c1b8e7cbc90f11ea" exitCode=143 Oct 07 17:24:52 crc kubenswrapper[4681]: I1007 17:24:52.936238 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 17:24:52 crc kubenswrapper[4681]: I1007 17:24:52.936702 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cfc39dfe-0be6-4936-94d8-17c9c96759c4","Type":"ContainerDied","Data":"d27b8ed7b4e9c4d33164b607a137a20bd47dba895611c27c087215a7194fc686"} Oct 07 17:24:52 crc kubenswrapper[4681]: I1007 17:24:52.936733 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cfc39dfe-0be6-4936-94d8-17c9c96759c4","Type":"ContainerDied","Data":"51cd817e21af3a8a7a6d5988a8c922bcfbddf6cc04da6330c1b8e7cbc90f11ea"} Oct 07 17:24:52 crc kubenswrapper[4681]: I1007 17:24:52.936744 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cfc39dfe-0be6-4936-94d8-17c9c96759c4","Type":"ContainerDied","Data":"00c6722fb852df422417a1f190941fb4113cd5e559876bb8c2172dddfb37aaf6"} Oct 07 17:24:52 crc kubenswrapper[4681]: I1007 17:24:52.936758 4681 scope.go:117] "RemoveContainer" containerID="d27b8ed7b4e9c4d33164b607a137a20bd47dba895611c27c087215a7194fc686" Oct 07 17:24:52 crc kubenswrapper[4681]: I1007 17:24:52.962163 4681 scope.go:117] "RemoveContainer" containerID="51cd817e21af3a8a7a6d5988a8c922bcfbddf6cc04da6330c1b8e7cbc90f11ea" Oct 07 17:24:52 crc kubenswrapper[4681]: I1007 17:24:52.992128 4681 scope.go:117] "RemoveContainer" containerID="d27b8ed7b4e9c4d33164b607a137a20bd47dba895611c27c087215a7194fc686" Oct 07 17:24:52 crc kubenswrapper[4681]: E1007 17:24:52.993416 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d27b8ed7b4e9c4d33164b607a137a20bd47dba895611c27c087215a7194fc686\": container with ID starting with d27b8ed7b4e9c4d33164b607a137a20bd47dba895611c27c087215a7194fc686 not found: ID does not exist" containerID="d27b8ed7b4e9c4d33164b607a137a20bd47dba895611c27c087215a7194fc686" Oct 07 17:24:52 crc kubenswrapper[4681]: I1007 17:24:52.993449 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d27b8ed7b4e9c4d33164b607a137a20bd47dba895611c27c087215a7194fc686"} err="failed to get container status \"d27b8ed7b4e9c4d33164b607a137a20bd47dba895611c27c087215a7194fc686\": rpc error: code = NotFound desc = could not find container \"d27b8ed7b4e9c4d33164b607a137a20bd47dba895611c27c087215a7194fc686\": container with ID starting with d27b8ed7b4e9c4d33164b607a137a20bd47dba895611c27c087215a7194fc686 not found: ID does not exist" Oct 07 17:24:52 crc kubenswrapper[4681]: I1007 17:24:52.993469 4681 
scope.go:117] "RemoveContainer" containerID="51cd817e21af3a8a7a6d5988a8c922bcfbddf6cc04da6330c1b8e7cbc90f11ea" Oct 07 17:24:52 crc kubenswrapper[4681]: E1007 17:24:52.993698 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51cd817e21af3a8a7a6d5988a8c922bcfbddf6cc04da6330c1b8e7cbc90f11ea\": container with ID starting with 51cd817e21af3a8a7a6d5988a8c922bcfbddf6cc04da6330c1b8e7cbc90f11ea not found: ID does not exist" containerID="51cd817e21af3a8a7a6d5988a8c922bcfbddf6cc04da6330c1b8e7cbc90f11ea" Oct 07 17:24:52 crc kubenswrapper[4681]: I1007 17:24:52.993720 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51cd817e21af3a8a7a6d5988a8c922bcfbddf6cc04da6330c1b8e7cbc90f11ea"} err="failed to get container status \"51cd817e21af3a8a7a6d5988a8c922bcfbddf6cc04da6330c1b8e7cbc90f11ea\": rpc error: code = NotFound desc = could not find container \"51cd817e21af3a8a7a6d5988a8c922bcfbddf6cc04da6330c1b8e7cbc90f11ea\": container with ID starting with 51cd817e21af3a8a7a6d5988a8c922bcfbddf6cc04da6330c1b8e7cbc90f11ea not found: ID does not exist" Oct 07 17:24:52 crc kubenswrapper[4681]: I1007 17:24:52.993734 4681 scope.go:117] "RemoveContainer" containerID="d27b8ed7b4e9c4d33164b607a137a20bd47dba895611c27c087215a7194fc686" Oct 07 17:24:52 crc kubenswrapper[4681]: I1007 17:24:52.993972 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d27b8ed7b4e9c4d33164b607a137a20bd47dba895611c27c087215a7194fc686"} err="failed to get container status \"d27b8ed7b4e9c4d33164b607a137a20bd47dba895611c27c087215a7194fc686\": rpc error: code = NotFound desc = could not find container \"d27b8ed7b4e9c4d33164b607a137a20bd47dba895611c27c087215a7194fc686\": container with ID starting with d27b8ed7b4e9c4d33164b607a137a20bd47dba895611c27c087215a7194fc686 not found: ID does not exist" Oct 07 17:24:52 crc kubenswrapper[4681]: I1007 17:24:52.993989 4681 scope.go:117] "RemoveContainer" containerID="51cd817e21af3a8a7a6d5988a8c922bcfbddf6cc04da6330c1b8e7cbc90f11ea" Oct 07 17:24:52 crc kubenswrapper[4681]: I1007 17:24:52.994195 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51cd817e21af3a8a7a6d5988a8c922bcfbddf6cc04da6330c1b8e7cbc90f11ea"} err="failed to get container status \"51cd817e21af3a8a7a6d5988a8c922bcfbddf6cc04da6330c1b8e7cbc90f11ea\": rpc error: code = NotFound desc = could not find container \"51cd817e21af3a8a7a6d5988a8c922bcfbddf6cc04da6330c1b8e7cbc90f11ea\": container with ID starting with 51cd817e21af3a8a7a6d5988a8c922bcfbddf6cc04da6330c1b8e7cbc90f11ea not found: ID does not exist" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.011894 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfc39dfe-0be6-4936-94d8-17c9c96759c4-config-data\") pod \"cfc39dfe-0be6-4936-94d8-17c9c96759c4\" (UID: \"cfc39dfe-0be6-4936-94d8-17c9c96759c4\") " Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.012009 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfc39dfe-0be6-4936-94d8-17c9c96759c4-combined-ca-bundle\") pod \"cfc39dfe-0be6-4936-94d8-17c9c96759c4\" (UID: \"cfc39dfe-0be6-4936-94d8-17c9c96759c4\") " Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.012054 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-wwtft\" (UniqueName: \"kubernetes.io/projected/cfc39dfe-0be6-4936-94d8-17c9c96759c4-kube-api-access-wwtft\") pod \"cfc39dfe-0be6-4936-94d8-17c9c96759c4\" (UID: \"cfc39dfe-0be6-4936-94d8-17c9c96759c4\") " Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.012192 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfc39dfe-0be6-4936-94d8-17c9c96759c4-logs\") pod \"cfc39dfe-0be6-4936-94d8-17c9c96759c4\" (UID: \"cfc39dfe-0be6-4936-94d8-17c9c96759c4\") " Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.013569 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfc39dfe-0be6-4936-94d8-17c9c96759c4-logs" (OuterVolumeSpecName: "logs") pod "cfc39dfe-0be6-4936-94d8-17c9c96759c4" (UID: "cfc39dfe-0be6-4936-94d8-17c9c96759c4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.019293 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfc39dfe-0be6-4936-94d8-17c9c96759c4-kube-api-access-wwtft" (OuterVolumeSpecName: "kube-api-access-wwtft") pod "cfc39dfe-0be6-4936-94d8-17c9c96759c4" (UID: "cfc39dfe-0be6-4936-94d8-17c9c96759c4"). InnerVolumeSpecName "kube-api-access-wwtft". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.097991 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfc39dfe-0be6-4936-94d8-17c9c96759c4-config-data" (OuterVolumeSpecName: "config-data") pod "cfc39dfe-0be6-4936-94d8-17c9c96759c4" (UID: "cfc39dfe-0be6-4936-94d8-17c9c96759c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.113109 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfc39dfe-0be6-4936-94d8-17c9c96759c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfc39dfe-0be6-4936-94d8-17c9c96759c4" (UID: "cfc39dfe-0be6-4936-94d8-17c9c96759c4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.118111 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfc39dfe-0be6-4936-94d8-17c9c96759c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.118148 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwtft\" (UniqueName: \"kubernetes.io/projected/cfc39dfe-0be6-4936-94d8-17c9c96759c4-kube-api-access-wwtft\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.118161 4681 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfc39dfe-0be6-4936-94d8-17c9c96759c4-logs\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.118169 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfc39dfe-0be6-4936-94d8-17c9c96759c4-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.270930 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.279623 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.316785 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 17:24:53 crc kubenswrapper[4681]: E1007 17:24:53.317154 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc39dfe-0be6-4936-94d8-17c9c96759c4" containerName="nova-metadata-metadata" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.317171 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc39dfe-0be6-4936-94d8-17c9c96759c4" containerName="nova-metadata-metadata" Oct 07 17:24:53 crc kubenswrapper[4681]: E1007 17:24:53.317185 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc39dfe-0be6-4936-94d8-17c9c96759c4" containerName="nova-metadata-log" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.317191 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc39dfe-0be6-4936-94d8-17c9c96759c4" containerName="nova-metadata-log" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.317388 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfc39dfe-0be6-4936-94d8-17c9c96759c4" containerName="nova-metadata-metadata" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.317407 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfc39dfe-0be6-4936-94d8-17c9c96759c4" containerName="nova-metadata-log" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.318353 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.326444 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.326767 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.336432 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.434500 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79578193-6cc5-48a4-a13c-c387bebdea0e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"79578193-6cc5-48a4-a13c-c387bebdea0e\") " pod="openstack/nova-metadata-0" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.434605 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79578193-6cc5-48a4-a13c-c387bebdea0e-logs\") pod \"nova-metadata-0\" (UID: \"79578193-6cc5-48a4-a13c-c387bebdea0e\") " pod="openstack/nova-metadata-0" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.434690 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nnvk\" (UniqueName: \"kubernetes.io/projected/79578193-6cc5-48a4-a13c-c387bebdea0e-kube-api-access-6nnvk\") pod \"nova-metadata-0\" (UID: \"79578193-6cc5-48a4-a13c-c387bebdea0e\") " pod="openstack/nova-metadata-0" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.434715 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/79578193-6cc5-48a4-a13c-c387bebdea0e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"79578193-6cc5-48a4-a13c-c387bebdea0e\") " pod="openstack/nova-metadata-0" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.434730 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79578193-6cc5-48a4-a13c-c387bebdea0e-config-data\") pod \"nova-metadata-0\" (UID: \"79578193-6cc5-48a4-a13c-c387bebdea0e\") " pod="openstack/nova-metadata-0" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.535648 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nnvk\" (UniqueName: \"kubernetes.io/projected/79578193-6cc5-48a4-a13c-c387bebdea0e-kube-api-access-6nnvk\") pod \"nova-metadata-0\" (UID: \"79578193-6cc5-48a4-a13c-c387bebdea0e\") " pod="openstack/nova-metadata-0" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.535691 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/79578193-6cc5-48a4-a13c-c387bebdea0e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"79578193-6cc5-48a4-a13c-c387bebdea0e\") " pod="openstack/nova-metadata-0" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.535710 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79578193-6cc5-48a4-a13c-c387bebdea0e-config-data\") pod \"nova-metadata-0\" (UID: \"79578193-6cc5-48a4-a13c-c387bebdea0e\") " 
pod="openstack/nova-metadata-0" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.535740 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79578193-6cc5-48a4-a13c-c387bebdea0e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"79578193-6cc5-48a4-a13c-c387bebdea0e\") " pod="openstack/nova-metadata-0" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.535818 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79578193-6cc5-48a4-a13c-c387bebdea0e-logs\") pod \"nova-metadata-0\" (UID: \"79578193-6cc5-48a4-a13c-c387bebdea0e\") " pod="openstack/nova-metadata-0" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.536271 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79578193-6cc5-48a4-a13c-c387bebdea0e-logs\") pod \"nova-metadata-0\" (UID: \"79578193-6cc5-48a4-a13c-c387bebdea0e\") " pod="openstack/nova-metadata-0" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.539191 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79578193-6cc5-48a4-a13c-c387bebdea0e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"79578193-6cc5-48a4-a13c-c387bebdea0e\") " pod="openstack/nova-metadata-0" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.540333 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/79578193-6cc5-48a4-a13c-c387bebdea0e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"79578193-6cc5-48a4-a13c-c387bebdea0e\") " pod="openstack/nova-metadata-0" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.552444 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nnvk\" (UniqueName: \"kubernetes.io/projected/79578193-6cc5-48a4-a13c-c387bebdea0e-kube-api-access-6nnvk\") pod \"nova-metadata-0\" (UID: \"79578193-6cc5-48a4-a13c-c387bebdea0e\") " pod="openstack/nova-metadata-0" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.553404 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79578193-6cc5-48a4-a13c-c387bebdea0e-config-data\") pod \"nova-metadata-0\" (UID: \"79578193-6cc5-48a4-a13c-c387bebdea0e\") " pod="openstack/nova-metadata-0" Oct 07 17:24:53 crc kubenswrapper[4681]: I1007 17:24:53.654707 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 17:24:54 crc kubenswrapper[4681]: I1007 17:24:54.115626 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 17:24:54 crc kubenswrapper[4681]: I1007 17:24:54.896107 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 17:24:54 crc kubenswrapper[4681]: I1007 17:24:54.896422 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 17:24:54 crc kubenswrapper[4681]: I1007 17:24:54.980178 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79578193-6cc5-48a4-a13c-c387bebdea0e","Type":"ContainerStarted","Data":"4b39dbac53034ebfd8c675ada7387fa0a2642dafc91bcdda5678876cefc4c367"} Oct 07 17:24:54 crc kubenswrapper[4681]: I1007 17:24:54.980221 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79578193-6cc5-48a4-a13c-c387bebdea0e","Type":"ContainerStarted","Data":"f6f3a1a63673ff6abe5582b8c59b7558af6cade16f35eedccdf3e51c3bb07452"} Oct 07 17:24:54 crc kubenswrapper[4681]: I1007 17:24:54.980242 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79578193-6cc5-48a4-a13c-c387bebdea0e","Type":"ContainerStarted","Data":"d8c1e584d423de94119c461c222eb68f8deb8adc689bdd5a7e1e2e02139b5a8b"} Oct 07 17:24:55 crc kubenswrapper[4681]: I1007 17:24:55.014519 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.014505412 podStartE2EDuration="2.014505412s" podCreationTimestamp="2025-10-07 17:24:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:24:55.012319322 +0000 UTC m=+1298.659730877" watchObservedRunningTime="2025-10-07 17:24:55.014505412 +0000 UTC m=+1298.661916967" Oct 07 17:24:55 crc kubenswrapper[4681]: I1007 17:24:55.038710 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfc39dfe-0be6-4936-94d8-17c9c96759c4" path="/var/lib/kubelet/pods/cfc39dfe-0be6-4936-94d8-17c9c96759c4/volumes" Oct 07 17:24:55 crc kubenswrapper[4681]: I1007 17:24:55.239105 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 07 17:24:55 crc kubenswrapper[4681]: I1007 17:24:55.239405 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 07 17:24:55 crc kubenswrapper[4681]: I1007 17:24:55.281153 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 07 17:24:55 crc kubenswrapper[4681]: I1007 17:24:55.285107 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-mcftl" Oct 07 17:24:55 crc kubenswrapper[4681]: I1007 17:24:55.385393 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:24:55 crc kubenswrapper[4681]: I1007 17:24:55.399248 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-txwrt"] Oct 07 17:24:55 crc kubenswrapper[4681]: I1007 17:24:55.399482 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-txwrt" podUID="01926d51-8e89-44e0-8032-7a701b7fcb92" containerName="dnsmasq-dns" 
containerID="cri-o://eb3685b49e8b00675ad0765596fdbdbd7b0d3c9265cfa5ffccd03c0e3023be48" gracePeriod=10 Oct 07 17:24:55 crc kubenswrapper[4681]: I1007 17:24:55.992145 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a7996c51-2302-4c44-8b48-c2380e0bbd00" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 17:24:55 crc kubenswrapper[4681]: I1007 17:24:55.992611 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a7996c51-2302-4c44-8b48-c2380e0bbd00" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 17:24:56 crc kubenswrapper[4681]: I1007 17:24:56.025096 4681 generic.go:334] "Generic (PLEG): container finished" podID="01926d51-8e89-44e0-8032-7a701b7fcb92" containerID="eb3685b49e8b00675ad0765596fdbdbd7b0d3c9265cfa5ffccd03c0e3023be48" exitCode=0 Oct 07 17:24:56 crc kubenswrapper[4681]: I1007 17:24:56.025979 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-txwrt" event={"ID":"01926d51-8e89-44e0-8032-7a701b7fcb92","Type":"ContainerDied","Data":"eb3685b49e8b00675ad0765596fdbdbd7b0d3c9265cfa5ffccd03c0e3023be48"} Oct 07 17:24:56 crc kubenswrapper[4681]: I1007 17:24:56.053333 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 07 17:24:56 crc kubenswrapper[4681]: I1007 17:24:56.199250 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-txwrt" Oct 07 17:24:56 crc kubenswrapper[4681]: I1007 17:24:56.213832 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-config\") pod \"01926d51-8e89-44e0-8032-7a701b7fcb92\" (UID: \"01926d51-8e89-44e0-8032-7a701b7fcb92\") " Oct 07 17:24:56 crc kubenswrapper[4681]: I1007 17:24:56.213873 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-dns-svc\") pod \"01926d51-8e89-44e0-8032-7a701b7fcb92\" (UID: \"01926d51-8e89-44e0-8032-7a701b7fcb92\") " Oct 07 17:24:56 crc kubenswrapper[4681]: I1007 17:24:56.213933 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrbl7\" (UniqueName: \"kubernetes.io/projected/01926d51-8e89-44e0-8032-7a701b7fcb92-kube-api-access-lrbl7\") pod \"01926d51-8e89-44e0-8032-7a701b7fcb92\" (UID: \"01926d51-8e89-44e0-8032-7a701b7fcb92\") " Oct 07 17:24:56 crc kubenswrapper[4681]: I1007 17:24:56.213965 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-ovsdbserver-nb\") pod \"01926d51-8e89-44e0-8032-7a701b7fcb92\" (UID: \"01926d51-8e89-44e0-8032-7a701b7fcb92\") " Oct 07 17:24:56 crc kubenswrapper[4681]: I1007 17:24:56.214026 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-ovsdbserver-sb\") pod \"01926d51-8e89-44e0-8032-7a701b7fcb92\" (UID: \"01926d51-8e89-44e0-8032-7a701b7fcb92\") " Oct 07 17:24:56 crc kubenswrapper[4681]: I1007 
17:24:56.214081 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-dns-swift-storage-0\") pod \"01926d51-8e89-44e0-8032-7a701b7fcb92\" (UID: \"01926d51-8e89-44e0-8032-7a701b7fcb92\") " Oct 07 17:24:56 crc kubenswrapper[4681]: I1007 17:24:56.247354 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01926d51-8e89-44e0-8032-7a701b7fcb92-kube-api-access-lrbl7" (OuterVolumeSpecName: "kube-api-access-lrbl7") pod "01926d51-8e89-44e0-8032-7a701b7fcb92" (UID: "01926d51-8e89-44e0-8032-7a701b7fcb92"). InnerVolumeSpecName "kube-api-access-lrbl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:24:56 crc kubenswrapper[4681]: I1007 17:24:56.295632 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "01926d51-8e89-44e0-8032-7a701b7fcb92" (UID: "01926d51-8e89-44e0-8032-7a701b7fcb92"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:24:56 crc kubenswrapper[4681]: I1007 17:24:56.316051 4681 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:56 crc kubenswrapper[4681]: I1007 17:24:56.316082 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrbl7\" (UniqueName: \"kubernetes.io/projected/01926d51-8e89-44e0-8032-7a701b7fcb92-kube-api-access-lrbl7\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:56 crc kubenswrapper[4681]: I1007 17:24:56.354994 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "01926d51-8e89-44e0-8032-7a701b7fcb92" (UID: "01926d51-8e89-44e0-8032-7a701b7fcb92"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:24:56 crc kubenswrapper[4681]: I1007 17:24:56.360640 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-config" (OuterVolumeSpecName: "config") pod "01926d51-8e89-44e0-8032-7a701b7fcb92" (UID: "01926d51-8e89-44e0-8032-7a701b7fcb92"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:24:56 crc kubenswrapper[4681]: I1007 17:24:56.377162 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "01926d51-8e89-44e0-8032-7a701b7fcb92" (UID: "01926d51-8e89-44e0-8032-7a701b7fcb92"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:24:56 crc kubenswrapper[4681]: I1007 17:24:56.400457 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "01926d51-8e89-44e0-8032-7a701b7fcb92" (UID: "01926d51-8e89-44e0-8032-7a701b7fcb92"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:24:56 crc kubenswrapper[4681]: I1007 17:24:56.418179 4681 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:56 crc kubenswrapper[4681]: I1007 17:24:56.418456 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:56 crc kubenswrapper[4681]: I1007 17:24:56.418583 4681 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:56 crc kubenswrapper[4681]: I1007 17:24:56.418757 4681 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01926d51-8e89-44e0-8032-7a701b7fcb92-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:57 crc kubenswrapper[4681]: I1007 17:24:57.047567 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-txwrt" Oct 07 17:24:57 crc kubenswrapper[4681]: I1007 17:24:57.052989 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-txwrt" event={"ID":"01926d51-8e89-44e0-8032-7a701b7fcb92","Type":"ContainerDied","Data":"f339039a340a591bbed2cfee53a5ab906b4e2e8d7f52b91c225648e496f60e35"} Oct 07 17:24:57 crc kubenswrapper[4681]: I1007 17:24:57.053097 4681 scope.go:117] "RemoveContainer" containerID="eb3685b49e8b00675ad0765596fdbdbd7b0d3c9265cfa5ffccd03c0e3023be48" Oct 07 17:24:57 crc kubenswrapper[4681]: I1007 17:24:57.131140 4681 scope.go:117] "RemoveContainer" containerID="d38e7941d43d36d6a81b8bd886fbe48613c61b6aece459831545f7a6d423f482" Oct 07 17:24:57 crc kubenswrapper[4681]: I1007 17:24:57.152307 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-txwrt"] Oct 07 17:24:57 crc kubenswrapper[4681]: I1007 17:24:57.162748 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-txwrt"] Oct 07 17:24:57 crc kubenswrapper[4681]: I1007 17:24:57.441272 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-64677bd694-6xgb2" podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.142:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.142:8443: connect: connection refused" Oct 07 17:24:57 crc kubenswrapper[4681]: I1007 17:24:57.618504 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-f945f854d-hm49c" podUID="02a91326-9285-4589-a05b-c0a2c2ed397e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Oct 07 17:24:58 crc kubenswrapper[4681]: I1007 17:24:58.056198 4681 generic.go:334] "Generic (PLEG): container finished" podID="bbd85044-6828-4eb3-89df-b7efcd333c6a" containerID="473b56e8b1ae81160a230176445922e3c0bd2a0220f9cd450022e93dd99fb60b" exitCode=0 Oct 07 17:24:58 crc kubenswrapper[4681]: I1007 17:24:58.056263 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bw9sz" 
event={"ID":"bbd85044-6828-4eb3-89df-b7efcd333c6a","Type":"ContainerDied","Data":"473b56e8b1ae81160a230176445922e3c0bd2a0220f9cd450022e93dd99fb60b"} Oct 07 17:24:58 crc kubenswrapper[4681]: I1007 17:24:58.058170 4681 generic.go:334] "Generic (PLEG): container finished" podID="42d37201-5b36-4972-a378-7e20139e4731" containerID="fd4a6b3e938118ec75880ea0e489dff7d2fe5084b9dd35cd3a002f7db3f74d1c" exitCode=0 Oct 07 17:24:58 crc kubenswrapper[4681]: I1007 17:24:58.058239 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kzhqh" event={"ID":"42d37201-5b36-4972-a378-7e20139e4731","Type":"ContainerDied","Data":"fd4a6b3e938118ec75880ea0e489dff7d2fe5084b9dd35cd3a002f7db3f74d1c"} Oct 07 17:24:58 crc kubenswrapper[4681]: I1007 17:24:58.655359 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 17:24:58 crc kubenswrapper[4681]: I1007 17:24:58.656145 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 17:24:59 crc kubenswrapper[4681]: I1007 17:24:59.038671 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01926d51-8e89-44e0-8032-7a701b7fcb92" path="/var/lib/kubelet/pods/01926d51-8e89-44e0-8032-7a701b7fcb92/volumes" Oct 07 17:24:59 crc kubenswrapper[4681]: I1007 17:24:59.626378 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kzhqh" Oct 07 17:24:59 crc kubenswrapper[4681]: I1007 17:24:59.697010 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2rnl\" (UniqueName: \"kubernetes.io/projected/42d37201-5b36-4972-a378-7e20139e4731-kube-api-access-z2rnl\") pod \"42d37201-5b36-4972-a378-7e20139e4731\" (UID: \"42d37201-5b36-4972-a378-7e20139e4731\") " Oct 07 17:24:59 crc kubenswrapper[4681]: I1007 17:24:59.698241 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d37201-5b36-4972-a378-7e20139e4731-combined-ca-bundle\") pod \"42d37201-5b36-4972-a378-7e20139e4731\" (UID: \"42d37201-5b36-4972-a378-7e20139e4731\") " Oct 07 17:24:59 crc kubenswrapper[4681]: I1007 17:24:59.698563 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42d37201-5b36-4972-a378-7e20139e4731-scripts\") pod \"42d37201-5b36-4972-a378-7e20139e4731\" (UID: \"42d37201-5b36-4972-a378-7e20139e4731\") " Oct 07 17:24:59 crc kubenswrapper[4681]: I1007 17:24:59.698649 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42d37201-5b36-4972-a378-7e20139e4731-config-data\") pod \"42d37201-5b36-4972-a378-7e20139e4731\" (UID: \"42d37201-5b36-4972-a378-7e20139e4731\") " Oct 07 17:24:59 crc kubenswrapper[4681]: I1007 17:24:59.702890 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d37201-5b36-4972-a378-7e20139e4731-scripts" (OuterVolumeSpecName: "scripts") pod "42d37201-5b36-4972-a378-7e20139e4731" (UID: "42d37201-5b36-4972-a378-7e20139e4731"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:59 crc kubenswrapper[4681]: I1007 17:24:59.704071 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42d37201-5b36-4972-a378-7e20139e4731-kube-api-access-z2rnl" (OuterVolumeSpecName: "kube-api-access-z2rnl") pod "42d37201-5b36-4972-a378-7e20139e4731" (UID: "42d37201-5b36-4972-a378-7e20139e4731"). InnerVolumeSpecName "kube-api-access-z2rnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:24:59 crc kubenswrapper[4681]: I1007 17:24:59.746914 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d37201-5b36-4972-a378-7e20139e4731-config-data" (OuterVolumeSpecName: "config-data") pod "42d37201-5b36-4972-a378-7e20139e4731" (UID: "42d37201-5b36-4972-a378-7e20139e4731"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:59 crc kubenswrapper[4681]: I1007 17:24:59.747391 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d37201-5b36-4972-a378-7e20139e4731-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42d37201-5b36-4972-a378-7e20139e4731" (UID: "42d37201-5b36-4972-a378-7e20139e4731"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:59 crc kubenswrapper[4681]: I1007 17:24:59.801632 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42d37201-5b36-4972-a378-7e20139e4731-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:59 crc kubenswrapper[4681]: I1007 17:24:59.801670 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2rnl\" (UniqueName: \"kubernetes.io/projected/42d37201-5b36-4972-a378-7e20139e4731-kube-api-access-z2rnl\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:59 crc kubenswrapper[4681]: I1007 17:24:59.801681 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d37201-5b36-4972-a378-7e20139e4731-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:59 crc kubenswrapper[4681]: I1007 17:24:59.801691 4681 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42d37201-5b36-4972-a378-7e20139e4731-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 17:24:59 crc kubenswrapper[4681]: I1007 17:24:59.826946 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bw9sz" Oct 07 17:24:59 crc kubenswrapper[4681]: I1007 17:24:59.902903 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbd85044-6828-4eb3-89df-b7efcd333c6a-scripts\") pod \"bbd85044-6828-4eb3-89df-b7efcd333c6a\" (UID: \"bbd85044-6828-4eb3-89df-b7efcd333c6a\") " Oct 07 17:24:59 crc kubenswrapper[4681]: I1007 17:24:59.903043 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd85044-6828-4eb3-89df-b7efcd333c6a-combined-ca-bundle\") pod \"bbd85044-6828-4eb3-89df-b7efcd333c6a\" (UID: \"bbd85044-6828-4eb3-89df-b7efcd333c6a\") " Oct 07 17:24:59 crc kubenswrapper[4681]: I1007 17:24:59.903189 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nddkf\" (UniqueName: \"kubernetes.io/projected/bbd85044-6828-4eb3-89df-b7efcd333c6a-kube-api-access-nddkf\") pod \"bbd85044-6828-4eb3-89df-b7efcd333c6a\" (UID: \"bbd85044-6828-4eb3-89df-b7efcd333c6a\") " Oct 07 17:24:59 crc kubenswrapper[4681]: I1007 17:24:59.903228 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd85044-6828-4eb3-89df-b7efcd333c6a-config-data\") pod \"bbd85044-6828-4eb3-89df-b7efcd333c6a\" (UID: \"bbd85044-6828-4eb3-89df-b7efcd333c6a\") " Oct 07 17:24:59 crc kubenswrapper[4681]: I1007 17:24:59.908278 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbd85044-6828-4eb3-89df-b7efcd333c6a-scripts" (OuterVolumeSpecName: "scripts") pod "bbd85044-6828-4eb3-89df-b7efcd333c6a" (UID: "bbd85044-6828-4eb3-89df-b7efcd333c6a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:59 crc kubenswrapper[4681]: I1007 17:24:59.908911 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbd85044-6828-4eb3-89df-b7efcd333c6a-kube-api-access-nddkf" (OuterVolumeSpecName: "kube-api-access-nddkf") pod "bbd85044-6828-4eb3-89df-b7efcd333c6a" (UID: "bbd85044-6828-4eb3-89df-b7efcd333c6a"). InnerVolumeSpecName "kube-api-access-nddkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:24:59 crc kubenswrapper[4681]: I1007 17:24:59.935207 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbd85044-6828-4eb3-89df-b7efcd333c6a-config-data" (OuterVolumeSpecName: "config-data") pod "bbd85044-6828-4eb3-89df-b7efcd333c6a" (UID: "bbd85044-6828-4eb3-89df-b7efcd333c6a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:24:59 crc kubenswrapper[4681]: I1007 17:24:59.940493 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbd85044-6828-4eb3-89df-b7efcd333c6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbd85044-6828-4eb3-89df-b7efcd333c6a" (UID: "bbd85044-6828-4eb3-89df-b7efcd333c6a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.006208 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nddkf\" (UniqueName: \"kubernetes.io/projected/bbd85044-6828-4eb3-89df-b7efcd333c6a-kube-api-access-nddkf\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.006243 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd85044-6828-4eb3-89df-b7efcd333c6a-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.006256 4681 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbd85044-6828-4eb3-89df-b7efcd333c6a-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.006265 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd85044-6828-4eb3-89df-b7efcd333c6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.079590 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bw9sz" event={"ID":"bbd85044-6828-4eb3-89df-b7efcd333c6a","Type":"ContainerDied","Data":"fc39ff4f2aacf9819231870fcea5d2efc3d5f4edec48ef28cff40bcf7897b2a1"} Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.079799 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc39ff4f2aacf9819231870fcea5d2efc3d5f4edec48ef28cff40bcf7897b2a1" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.081100 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kzhqh" event={"ID":"42d37201-5b36-4972-a378-7e20139e4731","Type":"ContainerDied","Data":"d22df61524673c9b9139efbc0a6cc42b4f7e47252f32d7e9eef5910f34f8c66a"} Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.081195 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d22df61524673c9b9139efbc0a6cc42b4f7e47252f32d7e9eef5910f34f8c66a" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.081310 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bw9sz" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.081387 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kzhqh" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.185531 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 17:25:00 crc kubenswrapper[4681]: E1007 17:25:00.189941 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42d37201-5b36-4972-a378-7e20139e4731" containerName="nova-manage" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.189970 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="42d37201-5b36-4972-a378-7e20139e4731" containerName="nova-manage" Oct 07 17:25:00 crc kubenswrapper[4681]: E1007 17:25:00.189997 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbd85044-6828-4eb3-89df-b7efcd333c6a" containerName="nova-cell1-conductor-db-sync" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.190007 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbd85044-6828-4eb3-89df-b7efcd333c6a" containerName="nova-cell1-conductor-db-sync" Oct 07 17:25:00 crc kubenswrapper[4681]: E1007 17:25:00.190286 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01926d51-8e89-44e0-8032-7a701b7fcb92" containerName="dnsmasq-dns" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.190301 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="01926d51-8e89-44e0-8032-7a701b7fcb92" containerName="dnsmasq-dns" Oct 07 17:25:00 crc kubenswrapper[4681]: E1007 17:25:00.190312 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01926d51-8e89-44e0-8032-7a701b7fcb92" containerName="init" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.190319 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="01926d51-8e89-44e0-8032-7a701b7fcb92" containerName="init" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.190535 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="42d37201-5b36-4972-a378-7e20139e4731" containerName="nova-manage" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.190561 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="01926d51-8e89-44e0-8032-7a701b7fcb92" containerName="dnsmasq-dns" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.190572 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbd85044-6828-4eb3-89df-b7efcd333c6a" containerName="nova-cell1-conductor-db-sync" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.192860 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.195413 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.207039 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.301275 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.301568 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="84994a7f-c636-4c7d-9b95-6f3b6eda305e" containerName="nova-scheduler-scheduler" containerID="cri-o://b442b30e3cc98966758b856dd0793a9a6b84fedd3d3fd85c7df3d3b5c46969bb" gracePeriod=30 Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.312981 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1765198-be66-424a-b57a-187a6b62c4bc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a1765198-be66-424a-b57a-187a6b62c4bc\") " pod="openstack/nova-cell1-conductor-0" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.313095 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1765198-be66-424a-b57a-187a6b62c4bc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a1765198-be66-424a-b57a-187a6b62c4bc\") " pod="openstack/nova-cell1-conductor-0" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.313115 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nlrj\" (UniqueName: \"kubernetes.io/projected/a1765198-be66-424a-b57a-187a6b62c4bc-kube-api-access-9nlrj\") pod \"nova-cell1-conductor-0\" (UID: \"a1765198-be66-424a-b57a-187a6b62c4bc\") " pod="openstack/nova-cell1-conductor-0" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.323255 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.323694 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a7996c51-2302-4c44-8b48-c2380e0bbd00" containerName="nova-api-log" containerID="cri-o://6d38e462b329976d009a9862e1fa52dd6bc8773fe4f2f323a2878f30df2dc95e" gracePeriod=30 Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.323914 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a7996c51-2302-4c44-8b48-c2380e0bbd00" containerName="nova-api-api" containerID="cri-o://5cfe10fbc70bf03cd65ae986c6b40643100fd4dd469745ce4ce474d58591c3c2" gracePeriod=30 Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.339174 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.339432 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="79578193-6cc5-48a4-a13c-c387bebdea0e" containerName="nova-metadata-log" containerID="cri-o://f6f3a1a63673ff6abe5582b8c59b7558af6cade16f35eedccdf3e51c3bb07452" gracePeriod=30 Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.339573 4681 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/nova-metadata-0" podUID="79578193-6cc5-48a4-a13c-c387bebdea0e" containerName="nova-metadata-metadata" containerID="cri-o://4b39dbac53034ebfd8c675ada7387fa0a2642dafc91bcdda5678876cefc4c367" gracePeriod=30 Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.415143 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1765198-be66-424a-b57a-187a6b62c4bc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a1765198-be66-424a-b57a-187a6b62c4bc\") " pod="openstack/nova-cell1-conductor-0" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.415184 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nlrj\" (UniqueName: \"kubernetes.io/projected/a1765198-be66-424a-b57a-187a6b62c4bc-kube-api-access-9nlrj\") pod \"nova-cell1-conductor-0\" (UID: \"a1765198-be66-424a-b57a-187a6b62c4bc\") " pod="openstack/nova-cell1-conductor-0" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.415306 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1765198-be66-424a-b57a-187a6b62c4bc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a1765198-be66-424a-b57a-187a6b62c4bc\") " pod="openstack/nova-cell1-conductor-0" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.421113 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1765198-be66-424a-b57a-187a6b62c4bc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a1765198-be66-424a-b57a-187a6b62c4bc\") " pod="openstack/nova-cell1-conductor-0" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.421268 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1765198-be66-424a-b57a-187a6b62c4bc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a1765198-be66-424a-b57a-187a6b62c4bc\") " pod="openstack/nova-cell1-conductor-0" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.435050 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nlrj\" (UniqueName: \"kubernetes.io/projected/a1765198-be66-424a-b57a-187a6b62c4bc-kube-api-access-9nlrj\") pod \"nova-cell1-conductor-0\" (UID: \"a1765198-be66-424a-b57a-187a6b62c4bc\") " pod="openstack/nova-cell1-conductor-0" Oct 07 17:25:00 crc kubenswrapper[4681]: I1007 17:25:00.521421 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.099365 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.161941 4681 generic.go:334] "Generic (PLEG): container finished" podID="79578193-6cc5-48a4-a13c-c387bebdea0e" containerID="4b39dbac53034ebfd8c675ada7387fa0a2642dafc91bcdda5678876cefc4c367" exitCode=0 Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.161988 4681 generic.go:334] "Generic (PLEG): container finished" podID="79578193-6cc5-48a4-a13c-c387bebdea0e" containerID="f6f3a1a63673ff6abe5582b8c59b7558af6cade16f35eedccdf3e51c3bb07452" exitCode=143 Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.162058 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79578193-6cc5-48a4-a13c-c387bebdea0e","Type":"ContainerDied","Data":"4b39dbac53034ebfd8c675ada7387fa0a2642dafc91bcdda5678876cefc4c367"} Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.162089 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79578193-6cc5-48a4-a13c-c387bebdea0e","Type":"ContainerDied","Data":"f6f3a1a63673ff6abe5582b8c59b7558af6cade16f35eedccdf3e51c3bb07452"} Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.162101 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79578193-6cc5-48a4-a13c-c387bebdea0e","Type":"ContainerDied","Data":"d8c1e584d423de94119c461c222eb68f8deb8adc689bdd5a7e1e2e02139b5a8b"} Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.162120 4681 scope.go:117] "RemoveContainer" containerID="4b39dbac53034ebfd8c675ada7387fa0a2642dafc91bcdda5678876cefc4c367" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.162361 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.187846 4681 generic.go:334] "Generic (PLEG): container finished" podID="a7996c51-2302-4c44-8b48-c2380e0bbd00" containerID="6d38e462b329976d009a9862e1fa52dd6bc8773fe4f2f323a2878f30df2dc95e" exitCode=143 Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.187989 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a7996c51-2302-4c44-8b48-c2380e0bbd00","Type":"ContainerDied","Data":"6d38e462b329976d009a9862e1fa52dd6bc8773fe4f2f323a2878f30df2dc95e"} Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.216051 4681 scope.go:117] "RemoveContainer" containerID="f6f3a1a63673ff6abe5582b8c59b7558af6cade16f35eedccdf3e51c3bb07452" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.226965 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.234126 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/79578193-6cc5-48a4-a13c-c387bebdea0e-nova-metadata-tls-certs\") pod \"79578193-6cc5-48a4-a13c-c387bebdea0e\" (UID: \"79578193-6cc5-48a4-a13c-c387bebdea0e\") " Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.234715 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79578193-6cc5-48a4-a13c-c387bebdea0e-combined-ca-bundle\") pod \"79578193-6cc5-48a4-a13c-c387bebdea0e\" (UID: \"79578193-6cc5-48a4-a13c-c387bebdea0e\") " Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.235458 4681 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79578193-6cc5-48a4-a13c-c387bebdea0e-logs\") pod \"79578193-6cc5-48a4-a13c-c387bebdea0e\" (UID: \"79578193-6cc5-48a4-a13c-c387bebdea0e\") " Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.235590 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nnvk\" (UniqueName: \"kubernetes.io/projected/79578193-6cc5-48a4-a13c-c387bebdea0e-kube-api-access-6nnvk\") pod \"79578193-6cc5-48a4-a13c-c387bebdea0e\" (UID: \"79578193-6cc5-48a4-a13c-c387bebdea0e\") " Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.235749 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79578193-6cc5-48a4-a13c-c387bebdea0e-config-data\") pod \"79578193-6cc5-48a4-a13c-c387bebdea0e\" (UID: \"79578193-6cc5-48a4-a13c-c387bebdea0e\") " Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.236857 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79578193-6cc5-48a4-a13c-c387bebdea0e-logs" (OuterVolumeSpecName: "logs") pod "79578193-6cc5-48a4-a13c-c387bebdea0e" (UID: "79578193-6cc5-48a4-a13c-c387bebdea0e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.237579 4681 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79578193-6cc5-48a4-a13c-c387bebdea0e-logs\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.258690 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79578193-6cc5-48a4-a13c-c387bebdea0e-kube-api-access-6nnvk" (OuterVolumeSpecName: "kube-api-access-6nnvk") pod "79578193-6cc5-48a4-a13c-c387bebdea0e" (UID: "79578193-6cc5-48a4-a13c-c387bebdea0e"). InnerVolumeSpecName "kube-api-access-6nnvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.294204 4681 scope.go:117] "RemoveContainer" containerID="4b39dbac53034ebfd8c675ada7387fa0a2642dafc91bcdda5678876cefc4c367" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.294281 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79578193-6cc5-48a4-a13c-c387bebdea0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79578193-6cc5-48a4-a13c-c387bebdea0e" (UID: "79578193-6cc5-48a4-a13c-c387bebdea0e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:01 crc kubenswrapper[4681]: E1007 17:25:01.295988 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b39dbac53034ebfd8c675ada7387fa0a2642dafc91bcdda5678876cefc4c367\": container with ID starting with 4b39dbac53034ebfd8c675ada7387fa0a2642dafc91bcdda5678876cefc4c367 not found: ID does not exist" containerID="4b39dbac53034ebfd8c675ada7387fa0a2642dafc91bcdda5678876cefc4c367" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.296038 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b39dbac53034ebfd8c675ada7387fa0a2642dafc91bcdda5678876cefc4c367"} err="failed to get container status \"4b39dbac53034ebfd8c675ada7387fa0a2642dafc91bcdda5678876cefc4c367\": rpc error: code = NotFound desc = could not find container \"4b39dbac53034ebfd8c675ada7387fa0a2642dafc91bcdda5678876cefc4c367\": container with ID starting with 4b39dbac53034ebfd8c675ada7387fa0a2642dafc91bcdda5678876cefc4c367 not found: ID does not exist" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.296065 4681 scope.go:117] "RemoveContainer" containerID="f6f3a1a63673ff6abe5582b8c59b7558af6cade16f35eedccdf3e51c3bb07452" Oct 07 17:25:01 crc kubenswrapper[4681]: E1007 17:25:01.296560 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6f3a1a63673ff6abe5582b8c59b7558af6cade16f35eedccdf3e51c3bb07452\": container with ID starting with f6f3a1a63673ff6abe5582b8c59b7558af6cade16f35eedccdf3e51c3bb07452 not found: ID does not exist" containerID="f6f3a1a63673ff6abe5582b8c59b7558af6cade16f35eedccdf3e51c3bb07452" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.296671 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6f3a1a63673ff6abe5582b8c59b7558af6cade16f35eedccdf3e51c3bb07452"} err="failed to get container status \"f6f3a1a63673ff6abe5582b8c59b7558af6cade16f35eedccdf3e51c3bb07452\": rpc error: code = NotFound desc = could not find container \"f6f3a1a63673ff6abe5582b8c59b7558af6cade16f35eedccdf3e51c3bb07452\": container with ID starting with f6f3a1a63673ff6abe5582b8c59b7558af6cade16f35eedccdf3e51c3bb07452 not found: ID does not exist" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.296760 4681 scope.go:117] "RemoveContainer" containerID="4b39dbac53034ebfd8c675ada7387fa0a2642dafc91bcdda5678876cefc4c367" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.300279 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b39dbac53034ebfd8c675ada7387fa0a2642dafc91bcdda5678876cefc4c367"} err="failed to get container status \"4b39dbac53034ebfd8c675ada7387fa0a2642dafc91bcdda5678876cefc4c367\": rpc error: code = NotFound desc = could not find container \"4b39dbac53034ebfd8c675ada7387fa0a2642dafc91bcdda5678876cefc4c367\": container with ID starting with 4b39dbac53034ebfd8c675ada7387fa0a2642dafc91bcdda5678876cefc4c367 not found: ID does not exist" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.300380 4681 scope.go:117] "RemoveContainer" containerID="f6f3a1a63673ff6abe5582b8c59b7558af6cade16f35eedccdf3e51c3bb07452" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.300727 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6f3a1a63673ff6abe5582b8c59b7558af6cade16f35eedccdf3e51c3bb07452"} err="failed to get 
container status \"f6f3a1a63673ff6abe5582b8c59b7558af6cade16f35eedccdf3e51c3bb07452\": rpc error: code = NotFound desc = could not find container \"f6f3a1a63673ff6abe5582b8c59b7558af6cade16f35eedccdf3e51c3bb07452\": container with ID starting with f6f3a1a63673ff6abe5582b8c59b7558af6cade16f35eedccdf3e51c3bb07452 not found: ID does not exist" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.301851 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79578193-6cc5-48a4-a13c-c387bebdea0e-config-data" (OuterVolumeSpecName: "config-data") pod "79578193-6cc5-48a4-a13c-c387bebdea0e" (UID: "79578193-6cc5-48a4-a13c-c387bebdea0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.335479 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79578193-6cc5-48a4-a13c-c387bebdea0e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "79578193-6cc5-48a4-a13c-c387bebdea0e" (UID: "79578193-6cc5-48a4-a13c-c387bebdea0e"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.342141 4681 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/79578193-6cc5-48a4-a13c-c387bebdea0e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.342183 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79578193-6cc5-48a4-a13c-c387bebdea0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.342197 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nnvk\" (UniqueName: \"kubernetes.io/projected/79578193-6cc5-48a4-a13c-c387bebdea0e-kube-api-access-6nnvk\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.342212 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79578193-6cc5-48a4-a13c-c387bebdea0e-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.505033 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.524123 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.534281 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 17:25:01 crc kubenswrapper[4681]: E1007 17:25:01.534726 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79578193-6cc5-48a4-a13c-c387bebdea0e" containerName="nova-metadata-log" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.534745 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="79578193-6cc5-48a4-a13c-c387bebdea0e" containerName="nova-metadata-log" Oct 07 17:25:01 crc kubenswrapper[4681]: E1007 17:25:01.534771 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79578193-6cc5-48a4-a13c-c387bebdea0e" containerName="nova-metadata-metadata" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.534777 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="79578193-6cc5-48a4-a13c-c387bebdea0e" 
containerName="nova-metadata-metadata" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.535022 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="79578193-6cc5-48a4-a13c-c387bebdea0e" containerName="nova-metadata-metadata" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.535046 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="79578193-6cc5-48a4-a13c-c387bebdea0e" containerName="nova-metadata-log" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.536263 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.539126 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.539558 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.573170 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.646981 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6957504-c035-489e-95d3-3cab2485c2b0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c6957504-c035-489e-95d3-3cab2485c2b0\") " pod="openstack/nova-metadata-0" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.647083 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6957504-c035-489e-95d3-3cab2485c2b0-logs\") pod \"nova-metadata-0\" (UID: \"c6957504-c035-489e-95d3-3cab2485c2b0\") " pod="openstack/nova-metadata-0" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.647139 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6957504-c035-489e-95d3-3cab2485c2b0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c6957504-c035-489e-95d3-3cab2485c2b0\") " pod="openstack/nova-metadata-0" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.647160 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k2cg\" (UniqueName: \"kubernetes.io/projected/c6957504-c035-489e-95d3-3cab2485c2b0-kube-api-access-9k2cg\") pod \"nova-metadata-0\" (UID: \"c6957504-c035-489e-95d3-3cab2485c2b0\") " pod="openstack/nova-metadata-0" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.647328 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6957504-c035-489e-95d3-3cab2485c2b0-config-data\") pod \"nova-metadata-0\" (UID: \"c6957504-c035-489e-95d3-3cab2485c2b0\") " pod="openstack/nova-metadata-0" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.748908 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6957504-c035-489e-95d3-3cab2485c2b0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c6957504-c035-489e-95d3-3cab2485c2b0\") " pod="openstack/nova-metadata-0" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.748953 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9k2cg\" (UniqueName: \"kubernetes.io/projected/c6957504-c035-489e-95d3-3cab2485c2b0-kube-api-access-9k2cg\") pod \"nova-metadata-0\" (UID: \"c6957504-c035-489e-95d3-3cab2485c2b0\") " pod="openstack/nova-metadata-0" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.749002 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6957504-c035-489e-95d3-3cab2485c2b0-config-data\") pod \"nova-metadata-0\" (UID: \"c6957504-c035-489e-95d3-3cab2485c2b0\") " pod="openstack/nova-metadata-0" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.749088 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6957504-c035-489e-95d3-3cab2485c2b0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c6957504-c035-489e-95d3-3cab2485c2b0\") " pod="openstack/nova-metadata-0" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.749746 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6957504-c035-489e-95d3-3cab2485c2b0-logs\") pod \"nova-metadata-0\" (UID: \"c6957504-c035-489e-95d3-3cab2485c2b0\") " pod="openstack/nova-metadata-0" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.750426 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6957504-c035-489e-95d3-3cab2485c2b0-logs\") pod \"nova-metadata-0\" (UID: \"c6957504-c035-489e-95d3-3cab2485c2b0\") " pod="openstack/nova-metadata-0" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.757731 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6957504-c035-489e-95d3-3cab2485c2b0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c6957504-c035-489e-95d3-3cab2485c2b0\") " pod="openstack/nova-metadata-0" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.757771 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6957504-c035-489e-95d3-3cab2485c2b0-config-data\") pod \"nova-metadata-0\" (UID: \"c6957504-c035-489e-95d3-3cab2485c2b0\") " pod="openstack/nova-metadata-0" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.771614 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6957504-c035-489e-95d3-3cab2485c2b0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c6957504-c035-489e-95d3-3cab2485c2b0\") " pod="openstack/nova-metadata-0" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.795491 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k2cg\" (UniqueName: \"kubernetes.io/projected/c6957504-c035-489e-95d3-3cab2485c2b0-kube-api-access-9k2cg\") pod \"nova-metadata-0\" (UID: \"c6957504-c035-489e-95d3-3cab2485c2b0\") " pod="openstack/nova-metadata-0" Oct 07 17:25:01 crc kubenswrapper[4681]: I1007 17:25:01.928035 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 17:25:02 crc kubenswrapper[4681]: I1007 17:25:02.203085 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a1765198-be66-424a-b57a-187a6b62c4bc","Type":"ContainerStarted","Data":"d49e1ebb9b8ee9853e8098fdb40b6dbeabc6b3d41477e256e679b7c57a0f53e9"} Oct 07 17:25:02 crc kubenswrapper[4681]: I1007 17:25:02.203423 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a1765198-be66-424a-b57a-187a6b62c4bc","Type":"ContainerStarted","Data":"ab38474f9562ab3fd81265d3f3e43c0c02e48e454b0d05b08bf0dd5421402a17"} Oct 07 17:25:02 crc kubenswrapper[4681]: I1007 17:25:02.203684 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 07 17:25:02 crc kubenswrapper[4681]: I1007 17:25:02.223280 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.223260764 podStartE2EDuration="2.223260764s" podCreationTimestamp="2025-10-07 17:25:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:25:02.222205554 +0000 UTC m=+1305.869617109" watchObservedRunningTime="2025-10-07 17:25:02.223260764 +0000 UTC m=+1305.870672319" Oct 07 17:25:02 crc kubenswrapper[4681]: I1007 17:25:02.422236 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 17:25:03 crc kubenswrapper[4681]: I1007 17:25:03.039463 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79578193-6cc5-48a4-a13c-c387bebdea0e" path="/var/lib/kubelet/pods/79578193-6cc5-48a4-a13c-c387bebdea0e/volumes" Oct 07 17:25:03 crc kubenswrapper[4681]: I1007 17:25:03.212623 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c6957504-c035-489e-95d3-3cab2485c2b0","Type":"ContainerStarted","Data":"14d18da14598eb1d8e16120ac6d00143ca57ebece7658e61bb9bdb3fc0a34e06"} Oct 07 17:25:03 crc kubenswrapper[4681]: I1007 17:25:03.213427 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c6957504-c035-489e-95d3-3cab2485c2b0","Type":"ContainerStarted","Data":"48b1afc9744caf6610a0da91855ce37fdf399ad2c93bca28a113f1e46a2a6872"} Oct 07 17:25:03 crc kubenswrapper[4681]: I1007 17:25:03.213443 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c6957504-c035-489e-95d3-3cab2485c2b0","Type":"ContainerStarted","Data":"6641f546e2c444e5770aa63c51ecb67fbb69b8fb6e823e426acba700f3138f4f"} Oct 07 17:25:03 crc kubenswrapper[4681]: I1007 17:25:03.237837 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.237818319 podStartE2EDuration="2.237818319s" podCreationTimestamp="2025-10-07 17:25:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:25:03.235495664 +0000 UTC m=+1306.882907219" watchObservedRunningTime="2025-10-07 17:25:03.237818319 +0000 UTC m=+1306.885229874" Oct 07 17:25:03 crc kubenswrapper[4681]: I1007 17:25:03.879118 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 17:25:03 crc kubenswrapper[4681]: I1007 17:25:03.996474 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7996c51-2302-4c44-8b48-c2380e0bbd00-logs\") pod \"a7996c51-2302-4c44-8b48-c2380e0bbd00\" (UID: \"a7996c51-2302-4c44-8b48-c2380e0bbd00\") " Oct 07 17:25:03 crc kubenswrapper[4681]: I1007 17:25:03.997317 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7996c51-2302-4c44-8b48-c2380e0bbd00-logs" (OuterVolumeSpecName: "logs") pod "a7996c51-2302-4c44-8b48-c2380e0bbd00" (UID: "a7996c51-2302-4c44-8b48-c2380e0bbd00"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:25:03 crc kubenswrapper[4681]: I1007 17:25:03.997402 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7996c51-2302-4c44-8b48-c2380e0bbd00-config-data\") pod \"a7996c51-2302-4c44-8b48-c2380e0bbd00\" (UID: \"a7996c51-2302-4c44-8b48-c2380e0bbd00\") " Oct 07 17:25:03 crc kubenswrapper[4681]: I1007 17:25:03.999127 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cksb\" (UniqueName: \"kubernetes.io/projected/a7996c51-2302-4c44-8b48-c2380e0bbd00-kube-api-access-6cksb\") pod \"a7996c51-2302-4c44-8b48-c2380e0bbd00\" (UID: \"a7996c51-2302-4c44-8b48-c2380e0bbd00\") " Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:03.999604 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7996c51-2302-4c44-8b48-c2380e0bbd00-combined-ca-bundle\") pod \"a7996c51-2302-4c44-8b48-c2380e0bbd00\" (UID: \"a7996c51-2302-4c44-8b48-c2380e0bbd00\") " Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.000273 4681 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7996c51-2302-4c44-8b48-c2380e0bbd00-logs\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.003562 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7996c51-2302-4c44-8b48-c2380e0bbd00-kube-api-access-6cksb" (OuterVolumeSpecName: "kube-api-access-6cksb") pod "a7996c51-2302-4c44-8b48-c2380e0bbd00" (UID: "a7996c51-2302-4c44-8b48-c2380e0bbd00"). InnerVolumeSpecName "kube-api-access-6cksb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.035082 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7996c51-2302-4c44-8b48-c2380e0bbd00-config-data" (OuterVolumeSpecName: "config-data") pod "a7996c51-2302-4c44-8b48-c2380e0bbd00" (UID: "a7996c51-2302-4c44-8b48-c2380e0bbd00"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.045269 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7996c51-2302-4c44-8b48-c2380e0bbd00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7996c51-2302-4c44-8b48-c2380e0bbd00" (UID: "a7996c51-2302-4c44-8b48-c2380e0bbd00"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.102330 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7996c51-2302-4c44-8b48-c2380e0bbd00-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.102368 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cksb\" (UniqueName: \"kubernetes.io/projected/a7996c51-2302-4c44-8b48-c2380e0bbd00-kube-api-access-6cksb\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.102377 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7996c51-2302-4c44-8b48-c2380e0bbd00-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.223576 4681 generic.go:334] "Generic (PLEG): container finished" podID="a7996c51-2302-4c44-8b48-c2380e0bbd00" containerID="5cfe10fbc70bf03cd65ae986c6b40643100fd4dd469745ce4ce474d58591c3c2" exitCode=0 Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.223629 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.223659 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a7996c51-2302-4c44-8b48-c2380e0bbd00","Type":"ContainerDied","Data":"5cfe10fbc70bf03cd65ae986c6b40643100fd4dd469745ce4ce474d58591c3c2"} Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.223744 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a7996c51-2302-4c44-8b48-c2380e0bbd00","Type":"ContainerDied","Data":"bb914a2a6c9b0d3203c68c186b5fe7f7d731fae0b5f3248992cb244c1a1776d6"} Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.223791 4681 scope.go:117] "RemoveContainer" containerID="5cfe10fbc70bf03cd65ae986c6b40643100fd4dd469745ce4ce474d58591c3c2" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.258130 4681 scope.go:117] "RemoveContainer" containerID="6d38e462b329976d009a9862e1fa52dd6bc8773fe4f2f323a2878f30df2dc95e" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.265156 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.297959 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.298699 4681 scope.go:117] "RemoveContainer" containerID="5cfe10fbc70bf03cd65ae986c6b40643100fd4dd469745ce4ce474d58591c3c2" Oct 07 17:25:04 crc kubenswrapper[4681]: E1007 17:25:04.299146 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cfe10fbc70bf03cd65ae986c6b40643100fd4dd469745ce4ce474d58591c3c2\": container with ID starting with 5cfe10fbc70bf03cd65ae986c6b40643100fd4dd469745ce4ce474d58591c3c2 not found: ID does not exist" containerID="5cfe10fbc70bf03cd65ae986c6b40643100fd4dd469745ce4ce474d58591c3c2" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.299179 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cfe10fbc70bf03cd65ae986c6b40643100fd4dd469745ce4ce474d58591c3c2"} err="failed to get container status \"5cfe10fbc70bf03cd65ae986c6b40643100fd4dd469745ce4ce474d58591c3c2\": rpc error: code = NotFound desc = could 
not find container \"5cfe10fbc70bf03cd65ae986c6b40643100fd4dd469745ce4ce474d58591c3c2\": container with ID starting with 5cfe10fbc70bf03cd65ae986c6b40643100fd4dd469745ce4ce474d58591c3c2 not found: ID does not exist" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.299202 4681 scope.go:117] "RemoveContainer" containerID="6d38e462b329976d009a9862e1fa52dd6bc8773fe4f2f323a2878f30df2dc95e" Oct 07 17:25:04 crc kubenswrapper[4681]: E1007 17:25:04.299464 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d38e462b329976d009a9862e1fa52dd6bc8773fe4f2f323a2878f30df2dc95e\": container with ID starting with 6d38e462b329976d009a9862e1fa52dd6bc8773fe4f2f323a2878f30df2dc95e not found: ID does not exist" containerID="6d38e462b329976d009a9862e1fa52dd6bc8773fe4f2f323a2878f30df2dc95e" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.299519 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d38e462b329976d009a9862e1fa52dd6bc8773fe4f2f323a2878f30df2dc95e"} err="failed to get container status \"6d38e462b329976d009a9862e1fa52dd6bc8773fe4f2f323a2878f30df2dc95e\": rpc error: code = NotFound desc = could not find container \"6d38e462b329976d009a9862e1fa52dd6bc8773fe4f2f323a2878f30df2dc95e\": container with ID starting with 6d38e462b329976d009a9862e1fa52dd6bc8773fe4f2f323a2878f30df2dc95e not found: ID does not exist" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.319945 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 17:25:04 crc kubenswrapper[4681]: E1007 17:25:04.321442 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7996c51-2302-4c44-8b48-c2380e0bbd00" containerName="nova-api-log" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.321485 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7996c51-2302-4c44-8b48-c2380e0bbd00" containerName="nova-api-log" Oct 07 17:25:04 crc kubenswrapper[4681]: E1007 17:25:04.321548 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7996c51-2302-4c44-8b48-c2380e0bbd00" containerName="nova-api-api" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.321557 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7996c51-2302-4c44-8b48-c2380e0bbd00" containerName="nova-api-api" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.337297 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7996c51-2302-4c44-8b48-c2380e0bbd00" containerName="nova-api-log" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.337337 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7996c51-2302-4c44-8b48-c2380e0bbd00" containerName="nova-api-api" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.338487 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.343000 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.355453 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.410157 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a0c104-2595-4646-b20a-7f3975f6874b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b9a0c104-2595-4646-b20a-7f3975f6874b\") " pod="openstack/nova-api-0" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.410233 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9a0c104-2595-4646-b20a-7f3975f6874b-config-data\") pod \"nova-api-0\" (UID: \"b9a0c104-2595-4646-b20a-7f3975f6874b\") " pod="openstack/nova-api-0" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.410313 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q6b6\" (UniqueName: \"kubernetes.io/projected/b9a0c104-2595-4646-b20a-7f3975f6874b-kube-api-access-7q6b6\") pod \"nova-api-0\" (UID: \"b9a0c104-2595-4646-b20a-7f3975f6874b\") " pod="openstack/nova-api-0" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.410398 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9a0c104-2595-4646-b20a-7f3975f6874b-logs\") pod \"nova-api-0\" (UID: \"b9a0c104-2595-4646-b20a-7f3975f6874b\") " pod="openstack/nova-api-0" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.512269 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q6b6\" (UniqueName: \"kubernetes.io/projected/b9a0c104-2595-4646-b20a-7f3975f6874b-kube-api-access-7q6b6\") pod \"nova-api-0\" (UID: \"b9a0c104-2595-4646-b20a-7f3975f6874b\") " pod="openstack/nova-api-0" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.512726 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9a0c104-2595-4646-b20a-7f3975f6874b-logs\") pod \"nova-api-0\" (UID: \"b9a0c104-2595-4646-b20a-7f3975f6874b\") " pod="openstack/nova-api-0" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.512868 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a0c104-2595-4646-b20a-7f3975f6874b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b9a0c104-2595-4646-b20a-7f3975f6874b\") " pod="openstack/nova-api-0" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.513014 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9a0c104-2595-4646-b20a-7f3975f6874b-config-data\") pod \"nova-api-0\" (UID: \"b9a0c104-2595-4646-b20a-7f3975f6874b\") " pod="openstack/nova-api-0" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.513100 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9a0c104-2595-4646-b20a-7f3975f6874b-logs\") pod \"nova-api-0\" (UID: \"b9a0c104-2595-4646-b20a-7f3975f6874b\") " 
pod="openstack/nova-api-0" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.517534 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9a0c104-2595-4646-b20a-7f3975f6874b-config-data\") pod \"nova-api-0\" (UID: \"b9a0c104-2595-4646-b20a-7f3975f6874b\") " pod="openstack/nova-api-0" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.517990 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a0c104-2595-4646-b20a-7f3975f6874b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b9a0c104-2595-4646-b20a-7f3975f6874b\") " pod="openstack/nova-api-0" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.529383 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q6b6\" (UniqueName: \"kubernetes.io/projected/b9a0c104-2595-4646-b20a-7f3975f6874b-kube-api-access-7q6b6\") pod \"nova-api-0\" (UID: \"b9a0c104-2595-4646-b20a-7f3975f6874b\") " pod="openstack/nova-api-0" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.682371 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 17:25:04 crc kubenswrapper[4681]: I1007 17:25:04.904284 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.029516 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84994a7f-c636-4c7d-9b95-6f3b6eda305e-combined-ca-bundle\") pod \"84994a7f-c636-4c7d-9b95-6f3b6eda305e\" (UID: \"84994a7f-c636-4c7d-9b95-6f3b6eda305e\") " Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.029719 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84994a7f-c636-4c7d-9b95-6f3b6eda305e-config-data\") pod \"84994a7f-c636-4c7d-9b95-6f3b6eda305e\" (UID: \"84994a7f-c636-4c7d-9b95-6f3b6eda305e\") " Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.029778 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw72j\" (UniqueName: \"kubernetes.io/projected/84994a7f-c636-4c7d-9b95-6f3b6eda305e-kube-api-access-bw72j\") pod \"84994a7f-c636-4c7d-9b95-6f3b6eda305e\" (UID: \"84994a7f-c636-4c7d-9b95-6f3b6eda305e\") " Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.044510 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7996c51-2302-4c44-8b48-c2380e0bbd00" path="/var/lib/kubelet/pods/a7996c51-2302-4c44-8b48-c2380e0bbd00/volumes" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.047170 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84994a7f-c636-4c7d-9b95-6f3b6eda305e-kube-api-access-bw72j" (OuterVolumeSpecName: "kube-api-access-bw72j") pod "84994a7f-c636-4c7d-9b95-6f3b6eda305e" (UID: "84994a7f-c636-4c7d-9b95-6f3b6eda305e"). InnerVolumeSpecName "kube-api-access-bw72j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.070922 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84994a7f-c636-4c7d-9b95-6f3b6eda305e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84994a7f-c636-4c7d-9b95-6f3b6eda305e" (UID: "84994a7f-c636-4c7d-9b95-6f3b6eda305e"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.100398 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84994a7f-c636-4c7d-9b95-6f3b6eda305e-config-data" (OuterVolumeSpecName: "config-data") pod "84994a7f-c636-4c7d-9b95-6f3b6eda305e" (UID: "84994a7f-c636-4c7d-9b95-6f3b6eda305e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.131706 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84994a7f-c636-4c7d-9b95-6f3b6eda305e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.131751 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84994a7f-c636-4c7d-9b95-6f3b6eda305e-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.131765 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw72j\" (UniqueName: \"kubernetes.io/projected/84994a7f-c636-4c7d-9b95-6f3b6eda305e-kube-api-access-bw72j\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.233226 4681 generic.go:334] "Generic (PLEG): container finished" podID="84994a7f-c636-4c7d-9b95-6f3b6eda305e" containerID="b442b30e3cc98966758b856dd0793a9a6b84fedd3d3fd85c7df3d3b5c46969bb" exitCode=0 Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.233283 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.233291 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"84994a7f-c636-4c7d-9b95-6f3b6eda305e","Type":"ContainerDied","Data":"b442b30e3cc98966758b856dd0793a9a6b84fedd3d3fd85c7df3d3b5c46969bb"} Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.233316 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"84994a7f-c636-4c7d-9b95-6f3b6eda305e","Type":"ContainerDied","Data":"5a9c971d60efb6b558e005de291d143a3f671dcafa8433a5e6d4f16e05c206ef"} Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.233331 4681 scope.go:117] "RemoveContainer" containerID="b442b30e3cc98966758b856dd0793a9a6b84fedd3d3fd85c7df3d3b5c46969bb" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.249972 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.295514 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.314570 4681 scope.go:117] "RemoveContainer" containerID="b442b30e3cc98966758b856dd0793a9a6b84fedd3d3fd85c7df3d3b5c46969bb" Oct 07 17:25:05 crc kubenswrapper[4681]: E1007 17:25:05.315305 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b442b30e3cc98966758b856dd0793a9a6b84fedd3d3fd85c7df3d3b5c46969bb\": container with ID starting with b442b30e3cc98966758b856dd0793a9a6b84fedd3d3fd85c7df3d3b5c46969bb not found: ID does not exist" containerID="b442b30e3cc98966758b856dd0793a9a6b84fedd3d3fd85c7df3d3b5c46969bb" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 
17:25:05.315346 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b442b30e3cc98966758b856dd0793a9a6b84fedd3d3fd85c7df3d3b5c46969bb"} err="failed to get container status \"b442b30e3cc98966758b856dd0793a9a6b84fedd3d3fd85c7df3d3b5c46969bb\": rpc error: code = NotFound desc = could not find container \"b442b30e3cc98966758b856dd0793a9a6b84fedd3d3fd85c7df3d3b5c46969bb\": container with ID starting with b442b30e3cc98966758b856dd0793a9a6b84fedd3d3fd85c7df3d3b5c46969bb not found: ID does not exist" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.332472 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 17:25:05 crc kubenswrapper[4681]: W1007 17:25:05.340412 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9a0c104_2595_4646_b20a_7f3975f6874b.slice/crio-51fccbe9c0e3e89d1faccf68be0af5f0862ae1befc186360774c3c10b0d5ec9e WatchSource:0}: Error finding container 51fccbe9c0e3e89d1faccf68be0af5f0862ae1befc186360774c3c10b0d5ec9e: Status 404 returned error can't find the container with id 51fccbe9c0e3e89d1faccf68be0af5f0862ae1befc186360774c3c10b0d5ec9e Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.341968 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.366945 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 17:25:05 crc kubenswrapper[4681]: E1007 17:25:05.367432 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84994a7f-c636-4c7d-9b95-6f3b6eda305e" containerName="nova-scheduler-scheduler" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.367526 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="84994a7f-c636-4c7d-9b95-6f3b6eda305e" containerName="nova-scheduler-scheduler" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.367830 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="84994a7f-c636-4c7d-9b95-6f3b6eda305e" containerName="nova-scheduler-scheduler" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.368590 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.378567 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.391568 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.436309 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a58956-0b94-4fcf-85a3-1d185f0e906f-config-data\") pod \"nova-scheduler-0\" (UID: \"92a58956-0b94-4fcf-85a3-1d185f0e906f\") " pod="openstack/nova-scheduler-0" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.436393 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvz9c\" (UniqueName: \"kubernetes.io/projected/92a58956-0b94-4fcf-85a3-1d185f0e906f-kube-api-access-wvz9c\") pod \"nova-scheduler-0\" (UID: \"92a58956-0b94-4fcf-85a3-1d185f0e906f\") " pod="openstack/nova-scheduler-0" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.436455 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a58956-0b94-4fcf-85a3-1d185f0e906f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"92a58956-0b94-4fcf-85a3-1d185f0e906f\") " pod="openstack/nova-scheduler-0" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.538002 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvz9c\" (UniqueName: \"kubernetes.io/projected/92a58956-0b94-4fcf-85a3-1d185f0e906f-kube-api-access-wvz9c\") pod \"nova-scheduler-0\" (UID: \"92a58956-0b94-4fcf-85a3-1d185f0e906f\") " pod="openstack/nova-scheduler-0" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.538368 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a58956-0b94-4fcf-85a3-1d185f0e906f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"92a58956-0b94-4fcf-85a3-1d185f0e906f\") " pod="openstack/nova-scheduler-0" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.538465 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a58956-0b94-4fcf-85a3-1d185f0e906f-config-data\") pod \"nova-scheduler-0\" (UID: \"92a58956-0b94-4fcf-85a3-1d185f0e906f\") " pod="openstack/nova-scheduler-0" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.554577 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a58956-0b94-4fcf-85a3-1d185f0e906f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"92a58956-0b94-4fcf-85a3-1d185f0e906f\") " pod="openstack/nova-scheduler-0" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.554650 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a58956-0b94-4fcf-85a3-1d185f0e906f-config-data\") pod \"nova-scheduler-0\" (UID: \"92a58956-0b94-4fcf-85a3-1d185f0e906f\") " pod="openstack/nova-scheduler-0" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.557396 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvz9c\" (UniqueName: 
\"kubernetes.io/projected/92a58956-0b94-4fcf-85a3-1d185f0e906f-kube-api-access-wvz9c\") pod \"nova-scheduler-0\" (UID: \"92a58956-0b94-4fcf-85a3-1d185f0e906f\") " pod="openstack/nova-scheduler-0" Oct 07 17:25:05 crc kubenswrapper[4681]: I1007 17:25:05.704416 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 17:25:06 crc kubenswrapper[4681]: I1007 17:25:06.246860 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b9a0c104-2595-4646-b20a-7f3975f6874b","Type":"ContainerStarted","Data":"d092e68c2259e6805749b2e6d0fee7d59542d9d8e278153ac5c31b6f16a08ede"} Oct 07 17:25:06 crc kubenswrapper[4681]: I1007 17:25:06.249630 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b9a0c104-2595-4646-b20a-7f3975f6874b","Type":"ContainerStarted","Data":"ed1946e9da9f4558a9025bdada061830bb6ce7a05e1701b31e57778ef7f0e7f4"} Oct 07 17:25:06 crc kubenswrapper[4681]: I1007 17:25:06.249795 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b9a0c104-2595-4646-b20a-7f3975f6874b","Type":"ContainerStarted","Data":"51fccbe9c0e3e89d1faccf68be0af5f0862ae1befc186360774c3c10b0d5ec9e"} Oct 07 17:25:06 crc kubenswrapper[4681]: I1007 17:25:06.273924 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.273905771 podStartE2EDuration="2.273905771s" podCreationTimestamp="2025-10-07 17:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:25:06.266862664 +0000 UTC m=+1309.914274219" watchObservedRunningTime="2025-10-07 17:25:06.273905771 +0000 UTC m=+1309.921317326" Oct 07 17:25:06 crc kubenswrapper[4681]: I1007 17:25:06.418467 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 17:25:06 crc kubenswrapper[4681]: I1007 17:25:06.928966 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 17:25:06 crc kubenswrapper[4681]: I1007 17:25:06.929275 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 17:25:07 crc kubenswrapper[4681]: I1007 17:25:07.040818 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84994a7f-c636-4c7d-9b95-6f3b6eda305e" path="/var/lib/kubelet/pods/84994a7f-c636-4c7d-9b95-6f3b6eda305e/volumes" Oct 07 17:25:07 crc kubenswrapper[4681]: I1007 17:25:07.262231 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"92a58956-0b94-4fcf-85a3-1d185f0e906f","Type":"ContainerStarted","Data":"e0691782af1fe751c4b9e8b818132858a96619ed4e805dc64adb28dc13d6989f"} Oct 07 17:25:07 crc kubenswrapper[4681]: I1007 17:25:07.262268 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"92a58956-0b94-4fcf-85a3-1d185f0e906f","Type":"ContainerStarted","Data":"b6ccb56153ff7cc2c060aeedfe4eb15fcb101ba8c3f304424c1ffc38bc3101bb"} Oct 07 17:25:07 crc kubenswrapper[4681]: I1007 17:25:07.280834 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.280818012 podStartE2EDuration="2.280818012s" podCreationTimestamp="2025-10-07 17:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-07 17:25:07.279238298 +0000 UTC m=+1310.926649853" watchObservedRunningTime="2025-10-07 17:25:07.280818012 +0000 UTC m=+1310.928229567" Oct 07 17:25:10 crc kubenswrapper[4681]: I1007 17:25:10.417393 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:25:10 crc kubenswrapper[4681]: I1007 17:25:10.457306 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:25:10 crc kubenswrapper[4681]: I1007 17:25:10.548437 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 07 17:25:10 crc kubenswrapper[4681]: I1007 17:25:10.704761 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 07 17:25:11 crc kubenswrapper[4681]: I1007 17:25:11.929606 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 17:25:11 crc kubenswrapper[4681]: I1007 17:25:11.929658 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 17:25:12 crc kubenswrapper[4681]: I1007 17:25:12.158798 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:25:12 crc kubenswrapper[4681]: I1007 17:25:12.195450 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 17:25:12 crc kubenswrapper[4681]: I1007 17:25:12.195511 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 17:25:12 crc kubenswrapper[4681]: I1007 17:25:12.195556 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" Oct 07 17:25:12 crc kubenswrapper[4681]: I1007 17:25:12.196310 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"78c5b31222deba1f8fdd3bf8fee1a2d7ac203687a55423d769012061ba951cb8"} pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 17:25:12 crc kubenswrapper[4681]: I1007 17:25:12.196373 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" containerID="cri-o://78c5b31222deba1f8fdd3bf8fee1a2d7ac203687a55423d769012061ba951cb8" gracePeriod=600 Oct 07 17:25:12 crc kubenswrapper[4681]: I1007 17:25:12.262424 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-f945f854d-hm49c" Oct 07 17:25:12 crc kubenswrapper[4681]: I1007 17:25:12.329358 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64677bd694-6xgb2"] Oct 07 17:25:12 crc kubenswrapper[4681]: I1007 
17:25:12.329561 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-64677bd694-6xgb2" podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon-log" containerID="cri-o://f12687af7e3841ca2a53c32deb4a7158e2c0c873f8ce45fbe4d823c0abd5a391" gracePeriod=30 Oct 07 17:25:12 crc kubenswrapper[4681]: I1007 17:25:12.329718 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-64677bd694-6xgb2" podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon" containerID="cri-o://b4889e462f03c208394a02d8c27c149d4669b02ad5367737278bbdc6137dfbb3" gracePeriod=30 Oct 07 17:25:12 crc kubenswrapper[4681]: I1007 17:25:12.942039 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c6957504-c035-489e-95d3-3cab2485c2b0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 17:25:12 crc kubenswrapper[4681]: I1007 17:25:12.942053 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c6957504-c035-489e-95d3-3cab2485c2b0" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 17:25:13 crc kubenswrapper[4681]: I1007 17:25:13.327416 4681 generic.go:334] "Generic (PLEG): container finished" podID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerID="78c5b31222deba1f8fdd3bf8fee1a2d7ac203687a55423d769012061ba951cb8" exitCode=0 Oct 07 17:25:13 crc kubenswrapper[4681]: I1007 17:25:13.327505 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerDied","Data":"78c5b31222deba1f8fdd3bf8fee1a2d7ac203687a55423d769012061ba951cb8"} Oct 07 17:25:13 crc kubenswrapper[4681]: I1007 17:25:13.327704 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerStarted","Data":"a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0"} Oct 07 17:25:13 crc kubenswrapper[4681]: I1007 17:25:13.327727 4681 scope.go:117] "RemoveContainer" containerID="b8b100182dd665e9c6705ef1fa26e28e1874f69676a8a7de938754edc7de052a" Oct 07 17:25:14 crc kubenswrapper[4681]: I1007 17:25:14.683978 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 17:25:14 crc kubenswrapper[4681]: I1007 17:25:14.684333 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 17:25:15 crc kubenswrapper[4681]: I1007 17:25:15.705526 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 07 17:25:15 crc kubenswrapper[4681]: I1007 17:25:15.732755 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 07 17:25:15 crc kubenswrapper[4681]: I1007 17:25:15.768036 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b9a0c104-2595-4646-b20a-7f3975f6874b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" 
Oct 07 17:25:15 crc kubenswrapper[4681]: I1007 17:25:15.768041 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b9a0c104-2595-4646-b20a-7f3975f6874b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 17:25:16 crc kubenswrapper[4681]: I1007 17:25:16.378799 4681 generic.go:334] "Generic (PLEG): container finished" podID="990e1913-44d7-414b-a116-6b712547fc81" containerID="b4889e462f03c208394a02d8c27c149d4669b02ad5367737278bbdc6137dfbb3" exitCode=0 Oct 07 17:25:16 crc kubenswrapper[4681]: I1007 17:25:16.378947 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64677bd694-6xgb2" event={"ID":"990e1913-44d7-414b-a116-6b712547fc81","Type":"ContainerDied","Data":"b4889e462f03c208394a02d8c27c149d4669b02ad5367737278bbdc6137dfbb3"} Oct 07 17:25:16 crc kubenswrapper[4681]: I1007 17:25:16.379255 4681 scope.go:117] "RemoveContainer" containerID="af63601f836949946b81ec10e42eb0edfd94800d61baa6f37919799bbd67f8db" Oct 07 17:25:16 crc kubenswrapper[4681]: I1007 17:25:16.409659 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 07 17:25:17 crc kubenswrapper[4681]: I1007 17:25:17.441336 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-64677bd694-6xgb2" podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.142:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.142:8443: connect: connection refused" Oct 07 17:25:21 crc kubenswrapper[4681]: I1007 17:25:21.937051 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 17:25:21 crc kubenswrapper[4681]: I1007 17:25:21.937601 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 17:25:21 crc kubenswrapper[4681]: I1007 17:25:21.941482 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 17:25:21 crc kubenswrapper[4681]: I1007 17:25:21.943189 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.363382 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.431670 4681 generic.go:334] "Generic (PLEG): container finished" podID="39ccace7-9bc9-426e-a9df-a5d58dbe5aa1" containerID="c067c544e931d305de8934b70b4ff57feb8ba2522b13b6b6940e4a65fc80b711" exitCode=137 Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.431725 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.431763 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"39ccace7-9bc9-426e-a9df-a5d58dbe5aa1","Type":"ContainerDied","Data":"c067c544e931d305de8934b70b4ff57feb8ba2522b13b6b6940e4a65fc80b711"} Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.431870 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"39ccace7-9bc9-426e-a9df-a5d58dbe5aa1","Type":"ContainerDied","Data":"28cdff51c4d5911478ff12d6ba6ec0ece83fa0057d976b3b76ac0192b911a89e"} Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.431933 4681 scope.go:117] "RemoveContainer" containerID="c067c544e931d305de8934b70b4ff57feb8ba2522b13b6b6940e4a65fc80b711" Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.454110 4681 scope.go:117] "RemoveContainer" containerID="c067c544e931d305de8934b70b4ff57feb8ba2522b13b6b6940e4a65fc80b711" Oct 07 17:25:22 crc kubenswrapper[4681]: E1007 17:25:22.454545 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c067c544e931d305de8934b70b4ff57feb8ba2522b13b6b6940e4a65fc80b711\": container with ID starting with c067c544e931d305de8934b70b4ff57feb8ba2522b13b6b6940e4a65fc80b711 not found: ID does not exist" containerID="c067c544e931d305de8934b70b4ff57feb8ba2522b13b6b6940e4a65fc80b711" Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.454577 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c067c544e931d305de8934b70b4ff57feb8ba2522b13b6b6940e4a65fc80b711"} err="failed to get container status \"c067c544e931d305de8934b70b4ff57feb8ba2522b13b6b6940e4a65fc80b711\": rpc error: code = NotFound desc = could not find container \"c067c544e931d305de8934b70b4ff57feb8ba2522b13b6b6940e4a65fc80b711\": container with ID starting with c067c544e931d305de8934b70b4ff57feb8ba2522b13b6b6940e4a65fc80b711 not found: ID does not exist" Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.500854 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9r8g\" (UniqueName: \"kubernetes.io/projected/39ccace7-9bc9-426e-a9df-a5d58dbe5aa1-kube-api-access-t9r8g\") pod \"39ccace7-9bc9-426e-a9df-a5d58dbe5aa1\" (UID: \"39ccace7-9bc9-426e-a9df-a5d58dbe5aa1\") " Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.501083 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39ccace7-9bc9-426e-a9df-a5d58dbe5aa1-config-data\") pod \"39ccace7-9bc9-426e-a9df-a5d58dbe5aa1\" (UID: \"39ccace7-9bc9-426e-a9df-a5d58dbe5aa1\") " Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.501144 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39ccace7-9bc9-426e-a9df-a5d58dbe5aa1-combined-ca-bundle\") pod \"39ccace7-9bc9-426e-a9df-a5d58dbe5aa1\" (UID: \"39ccace7-9bc9-426e-a9df-a5d58dbe5aa1\") " Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.516125 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39ccace7-9bc9-426e-a9df-a5d58dbe5aa1-kube-api-access-t9r8g" (OuterVolumeSpecName: "kube-api-access-t9r8g") pod "39ccace7-9bc9-426e-a9df-a5d58dbe5aa1" (UID: "39ccace7-9bc9-426e-a9df-a5d58dbe5aa1"). 
InnerVolumeSpecName "kube-api-access-t9r8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.540121 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39ccace7-9bc9-426e-a9df-a5d58dbe5aa1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39ccace7-9bc9-426e-a9df-a5d58dbe5aa1" (UID: "39ccace7-9bc9-426e-a9df-a5d58dbe5aa1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.543265 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39ccace7-9bc9-426e-a9df-a5d58dbe5aa1-config-data" (OuterVolumeSpecName: "config-data") pod "39ccace7-9bc9-426e-a9df-a5d58dbe5aa1" (UID: "39ccace7-9bc9-426e-a9df-a5d58dbe5aa1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.605200 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39ccace7-9bc9-426e-a9df-a5d58dbe5aa1-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.605224 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39ccace7-9bc9-426e-a9df-a5d58dbe5aa1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.605233 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9r8g\" (UniqueName: \"kubernetes.io/projected/39ccace7-9bc9-426e-a9df-a5d58dbe5aa1-kube-api-access-t9r8g\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.774010 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.789383 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.801914 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 17:25:22 crc kubenswrapper[4681]: E1007 17:25:22.802340 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39ccace7-9bc9-426e-a9df-a5d58dbe5aa1" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.802359 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="39ccace7-9bc9-426e-a9df-a5d58dbe5aa1" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.802528 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="39ccace7-9bc9-426e-a9df-a5d58dbe5aa1" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.803139 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.807935 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.808140 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.812822 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.817777 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.912101 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/48284d8c-6f51-4fa0-ae29-b933b93a2411-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"48284d8c-6f51-4fa0-ae29-b933b93a2411\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.912277 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk2h2\" (UniqueName: \"kubernetes.io/projected/48284d8c-6f51-4fa0-ae29-b933b93a2411-kube-api-access-hk2h2\") pod \"nova-cell1-novncproxy-0\" (UID: \"48284d8c-6f51-4fa0-ae29-b933b93a2411\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.912332 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/48284d8c-6f51-4fa0-ae29-b933b93a2411-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"48284d8c-6f51-4fa0-ae29-b933b93a2411\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.912818 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48284d8c-6f51-4fa0-ae29-b933b93a2411-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"48284d8c-6f51-4fa0-ae29-b933b93a2411\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:25:22 crc kubenswrapper[4681]: I1007 17:25:22.913280 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48284d8c-6f51-4fa0-ae29-b933b93a2411-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"48284d8c-6f51-4fa0-ae29-b933b93a2411\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:25:23 crc kubenswrapper[4681]: I1007 17:25:23.015465 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk2h2\" (UniqueName: \"kubernetes.io/projected/48284d8c-6f51-4fa0-ae29-b933b93a2411-kube-api-access-hk2h2\") pod \"nova-cell1-novncproxy-0\" (UID: \"48284d8c-6f51-4fa0-ae29-b933b93a2411\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:25:23 crc kubenswrapper[4681]: I1007 17:25:23.015514 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/48284d8c-6f51-4fa0-ae29-b933b93a2411-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"48284d8c-6f51-4fa0-ae29-b933b93a2411\") " 
pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:25:23 crc kubenswrapper[4681]: I1007 17:25:23.015581 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48284d8c-6f51-4fa0-ae29-b933b93a2411-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"48284d8c-6f51-4fa0-ae29-b933b93a2411\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:25:23 crc kubenswrapper[4681]: I1007 17:25:23.015681 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48284d8c-6f51-4fa0-ae29-b933b93a2411-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"48284d8c-6f51-4fa0-ae29-b933b93a2411\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:25:23 crc kubenswrapper[4681]: I1007 17:25:23.016239 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/48284d8c-6f51-4fa0-ae29-b933b93a2411-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"48284d8c-6f51-4fa0-ae29-b933b93a2411\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:25:23 crc kubenswrapper[4681]: I1007 17:25:23.020809 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/48284d8c-6f51-4fa0-ae29-b933b93a2411-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"48284d8c-6f51-4fa0-ae29-b933b93a2411\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:25:23 crc kubenswrapper[4681]: I1007 17:25:23.021970 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/48284d8c-6f51-4fa0-ae29-b933b93a2411-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"48284d8c-6f51-4fa0-ae29-b933b93a2411\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:25:23 crc kubenswrapper[4681]: I1007 17:25:23.022163 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48284d8c-6f51-4fa0-ae29-b933b93a2411-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"48284d8c-6f51-4fa0-ae29-b933b93a2411\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:25:23 crc kubenswrapper[4681]: I1007 17:25:23.025161 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48284d8c-6f51-4fa0-ae29-b933b93a2411-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"48284d8c-6f51-4fa0-ae29-b933b93a2411\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:25:23 crc kubenswrapper[4681]: I1007 17:25:23.036342 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk2h2\" (UniqueName: \"kubernetes.io/projected/48284d8c-6f51-4fa0-ae29-b933b93a2411-kube-api-access-hk2h2\") pod \"nova-cell1-novncproxy-0\" (UID: \"48284d8c-6f51-4fa0-ae29-b933b93a2411\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:25:23 crc kubenswrapper[4681]: I1007 17:25:23.041968 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39ccace7-9bc9-426e-a9df-a5d58dbe5aa1" path="/var/lib/kubelet/pods/39ccace7-9bc9-426e-a9df-a5d58dbe5aa1/volumes" Oct 07 17:25:23 crc kubenswrapper[4681]: I1007 17:25:23.118574 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:25:23 crc kubenswrapper[4681]: W1007 17:25:23.564254 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48284d8c_6f51_4fa0_ae29_b933b93a2411.slice/crio-0e4d23f5ec71e9592ddf7d1bab21faedbabe22e03fc95ee0641eac2d039ad4a0 WatchSource:0}: Error finding container 0e4d23f5ec71e9592ddf7d1bab21faedbabe22e03fc95ee0641eac2d039ad4a0: Status 404 returned error can't find the container with id 0e4d23f5ec71e9592ddf7d1bab21faedbabe22e03fc95ee0641eac2d039ad4a0 Oct 07 17:25:23 crc kubenswrapper[4681]: I1007 17:25:23.564898 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 17:25:24 crc kubenswrapper[4681]: I1007 17:25:24.475094 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"48284d8c-6f51-4fa0-ae29-b933b93a2411","Type":"ContainerStarted","Data":"b821fa0c8c7b278c4665d4744d63a87460045710d444f6359907993147f6b6a9"} Oct 07 17:25:24 crc kubenswrapper[4681]: I1007 17:25:24.475436 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"48284d8c-6f51-4fa0-ae29-b933b93a2411","Type":"ContainerStarted","Data":"0e4d23f5ec71e9592ddf7d1bab21faedbabe22e03fc95ee0641eac2d039ad4a0"} Oct 07 17:25:24 crc kubenswrapper[4681]: I1007 17:25:24.501960 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.5019393 podStartE2EDuration="2.5019393s" podCreationTimestamp="2025-10-07 17:25:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:25:24.495590423 +0000 UTC m=+1328.143001978" watchObservedRunningTime="2025-10-07 17:25:24.5019393 +0000 UTC m=+1328.149350855" Oct 07 17:25:24 crc kubenswrapper[4681]: I1007 17:25:24.689368 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 17:25:24 crc kubenswrapper[4681]: I1007 17:25:24.689997 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 17:25:24 crc kubenswrapper[4681]: I1007 17:25:24.690398 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 17:25:24 crc kubenswrapper[4681]: I1007 17:25:24.703181 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 17:25:25 crc kubenswrapper[4681]: I1007 17:25:25.485935 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 17:25:25 crc kubenswrapper[4681]: I1007 17:25:25.490218 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 17:25:25 crc kubenswrapper[4681]: I1007 17:25:25.691931 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-jf6hm"] Oct 07 17:25:25 crc kubenswrapper[4681]: I1007 17:25:25.693617 4681 util.go:30] "No sandbox for pod can be found. 
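The two ContainerStarted PLEG events above carry different IDs: 0e4d23f5... is the pod sandbox cri-o just created (the same ID the cadvisor watch warning complained about moments earlier, a benign race), while b821fa0c... is the novncproxy container inside it. The startup tracker then reports a podStartE2EDuration of about 2.5 s, and the zero-value firstStartedPulling/lastFinishedPulling timestamps suggest the image was already present, so no pull contributed to that latency. A sketch (hypothetical helper, same line-format assumption) that extracts these events for offline analysis:

    import re

    PLEG_RE = re.compile(r'event for pod" pod="([^"]+)" '
                         r'event=\{"ID":"[^"]+","Type":"([^"]+)","Data":"([^"]+)"\}')

    def pleg_events(log_text):
        # [(pod, event type, sandbox-or-container id), ...] in log order
        return [m.groups() for m in PLEG_RE.finditer(log_text)]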
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" Oct 07 17:25:25 crc kubenswrapper[4681]: I1007 17:25:25.729272 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-jf6hm"] Oct 07 17:25:25 crc kubenswrapper[4681]: I1007 17:25:25.777403 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-jf6hm\" (UID: \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" Oct 07 17:25:25 crc kubenswrapper[4681]: I1007 17:25:25.777460 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-jf6hm\" (UID: \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" Oct 07 17:25:25 crc kubenswrapper[4681]: I1007 17:25:25.777577 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp94f\" (UniqueName: \"kubernetes.io/projected/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-kube-api-access-bp94f\") pod \"dnsmasq-dns-cd5cbd7b9-jf6hm\" (UID: \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" Oct 07 17:25:25 crc kubenswrapper[4681]: I1007 17:25:25.777670 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-config\") pod \"dnsmasq-dns-cd5cbd7b9-jf6hm\" (UID: \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" Oct 07 17:25:25 crc kubenswrapper[4681]: I1007 17:25:25.777734 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-jf6hm\" (UID: \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" Oct 07 17:25:25 crc kubenswrapper[4681]: I1007 17:25:25.777750 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-jf6hm\" (UID: \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" Oct 07 17:25:25 crc kubenswrapper[4681]: I1007 17:25:25.879667 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-jf6hm\" (UID: \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" Oct 07 17:25:25 crc kubenswrapper[4681]: I1007 17:25:25.879705 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-jf6hm\" (UID: \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" Oct 07 17:25:25 crc kubenswrapper[4681]: I1007 17:25:25.879765 4681 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-jf6hm\" (UID: \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" Oct 07 17:25:25 crc kubenswrapper[4681]: I1007 17:25:25.879785 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-jf6hm\" (UID: \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" Oct 07 17:25:25 crc kubenswrapper[4681]: I1007 17:25:25.879860 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp94f\" (UniqueName: \"kubernetes.io/projected/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-kube-api-access-bp94f\") pod \"dnsmasq-dns-cd5cbd7b9-jf6hm\" (UID: \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" Oct 07 17:25:25 crc kubenswrapper[4681]: I1007 17:25:25.880323 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-config\") pod \"dnsmasq-dns-cd5cbd7b9-jf6hm\" (UID: \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" Oct 07 17:25:25 crc kubenswrapper[4681]: I1007 17:25:25.881191 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-jf6hm\" (UID: \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" Oct 07 17:25:25 crc kubenswrapper[4681]: I1007 17:25:25.881219 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-config\") pod \"dnsmasq-dns-cd5cbd7b9-jf6hm\" (UID: \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" Oct 07 17:25:25 crc kubenswrapper[4681]: I1007 17:25:25.881233 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-jf6hm\" (UID: \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" Oct 07 17:25:25 crc kubenswrapper[4681]: I1007 17:25:25.881624 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-jf6hm\" (UID: \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" Oct 07 17:25:25 crc kubenswrapper[4681]: I1007 17:25:25.881984 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-jf6hm\" (UID: \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" Oct 07 17:25:25 crc kubenswrapper[4681]: I1007 17:25:25.899586 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp94f\" (UniqueName: 
\"kubernetes.io/projected/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-kube-api-access-bp94f\") pod \"dnsmasq-dns-cd5cbd7b9-jf6hm\" (UID: \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" Oct 07 17:25:26 crc kubenswrapper[4681]: I1007 17:25:26.033217 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" Oct 07 17:25:26 crc kubenswrapper[4681]: I1007 17:25:26.544580 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-jf6hm"] Oct 07 17:25:27 crc kubenswrapper[4681]: I1007 17:25:27.059484 4681 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod01926d51-8e89-44e0-8032-7a701b7fcb92"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod01926d51-8e89-44e0-8032-7a701b7fcb92] : Timed out while waiting for systemd to remove kubepods-besteffort-pod01926d51_8e89_44e0_8032_7a701b7fcb92.slice" Oct 07 17:25:27 crc kubenswrapper[4681]: I1007 17:25:27.441467 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-64677bd694-6xgb2" podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.142:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.142:8443: connect: connection refused" Oct 07 17:25:27 crc kubenswrapper[4681]: I1007 17:25:27.504179 4681 generic.go:334] "Generic (PLEG): container finished" podID="ff96470c-7e4d-4f94-8ed6-c42fb4d63928" containerID="bc4d6bdb24f157eacff17ab3fb85a399e1b0acbd68259d6ce3970ee795d0c46c" exitCode=0 Oct 07 17:25:27 crc kubenswrapper[4681]: I1007 17:25:27.504245 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" event={"ID":"ff96470c-7e4d-4f94-8ed6-c42fb4d63928","Type":"ContainerDied","Data":"bc4d6bdb24f157eacff17ab3fb85a399e1b0acbd68259d6ce3970ee795d0c46c"} Oct 07 17:25:27 crc kubenswrapper[4681]: I1007 17:25:27.504292 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" event={"ID":"ff96470c-7e4d-4f94-8ed6-c42fb4d63928","Type":"ContainerStarted","Data":"9585a2440f12245e377ada037621f458817a8b87eaf73b3c9ebb64a9f150e392"} Oct 07 17:25:27 crc kubenswrapper[4681]: I1007 17:25:27.990262 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 17:25:27 crc kubenswrapper[4681]: I1007 17:25:27.994035 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8039090b-be20-41f5-8135-afb87372db43" containerName="ceilometer-central-agent" containerID="cri-o://3d8909ae6f5acd852247904740cf548c36b8fd8ebb063450909fe225f6e0a307" gracePeriod=30 Oct 07 17:25:27 crc kubenswrapper[4681]: I1007 17:25:27.994591 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8039090b-be20-41f5-8135-afb87372db43" containerName="proxy-httpd" containerID="cri-o://3227ac7d985aef34f42c90f5f24560179833ff14323eaabc6a78792879c0e455" gracePeriod=30 Oct 07 17:25:27 crc kubenswrapper[4681]: I1007 17:25:27.994689 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8039090b-be20-41f5-8135-afb87372db43" containerName="sg-core" containerID="cri-o://60fff8663367fb62de4a85a1c0f9318700d74e2d56b75e39723c84de5fd85996" gracePeriod=30 Oct 07 17:25:27 crc kubenswrapper[4681]: I1007 17:25:27.994727 4681 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8039090b-be20-41f5-8135-afb87372db43" containerName="ceilometer-notification-agent" containerID="cri-o://6b6b5ef3678bcf8d7f017217dfee2b64e474d1bde1924a6708487049c4700264" gracePeriod=30 Oct 07 17:25:28 crc kubenswrapper[4681]: I1007 17:25:28.120839 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:25:28 crc kubenswrapper[4681]: I1007 17:25:28.255093 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 17:25:28 crc kubenswrapper[4681]: I1007 17:25:28.515474 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" event={"ID":"ff96470c-7e4d-4f94-8ed6-c42fb4d63928","Type":"ContainerStarted","Data":"169b180b499eafdbab027192528b6ce922fffea81f375a56372be81f69df07b9"} Oct 07 17:25:28 crc kubenswrapper[4681]: I1007 17:25:28.515842 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" Oct 07 17:25:28 crc kubenswrapper[4681]: I1007 17:25:28.517786 4681 generic.go:334] "Generic (PLEG): container finished" podID="8039090b-be20-41f5-8135-afb87372db43" containerID="3227ac7d985aef34f42c90f5f24560179833ff14323eaabc6a78792879c0e455" exitCode=0 Oct 07 17:25:28 crc kubenswrapper[4681]: I1007 17:25:28.517820 4681 generic.go:334] "Generic (PLEG): container finished" podID="8039090b-be20-41f5-8135-afb87372db43" containerID="60fff8663367fb62de4a85a1c0f9318700d74e2d56b75e39723c84de5fd85996" exitCode=2 Oct 07 17:25:28 crc kubenswrapper[4681]: I1007 17:25:28.517834 4681 generic.go:334] "Generic (PLEG): container finished" podID="8039090b-be20-41f5-8135-afb87372db43" containerID="3d8909ae6f5acd852247904740cf548c36b8fd8ebb063450909fe225f6e0a307" exitCode=0 Oct 07 17:25:28 crc kubenswrapper[4681]: I1007 17:25:28.517855 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8039090b-be20-41f5-8135-afb87372db43","Type":"ContainerDied","Data":"3227ac7d985aef34f42c90f5f24560179833ff14323eaabc6a78792879c0e455"} Oct 07 17:25:28 crc kubenswrapper[4681]: I1007 17:25:28.517907 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8039090b-be20-41f5-8135-afb87372db43","Type":"ContainerDied","Data":"60fff8663367fb62de4a85a1c0f9318700d74e2d56b75e39723c84de5fd85996"} Oct 07 17:25:28 crc kubenswrapper[4681]: I1007 17:25:28.517922 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8039090b-be20-41f5-8135-afb87372db43","Type":"ContainerDied","Data":"3d8909ae6f5acd852247904740cf548c36b8fd8ebb063450909fe225f6e0a307"} Oct 07 17:25:28 crc kubenswrapper[4681]: I1007 17:25:28.518034 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b9a0c104-2595-4646-b20a-7f3975f6874b" containerName="nova-api-log" containerID="cri-o://ed1946e9da9f4558a9025bdada061830bb6ce7a05e1701b31e57778ef7f0e7f4" gracePeriod=30 Oct 07 17:25:28 crc kubenswrapper[4681]: I1007 17:25:28.518050 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b9a0c104-2595-4646-b20a-7f3975f6874b" containerName="nova-api-api" containerID="cri-o://d092e68c2259e6805749b2e6d0fee7d59542d9d8e278153ac5c31b6f16a08ede" gracePeriod=30 Oct 07 17:25:28 crc kubenswrapper[4681]: I1007 17:25:28.543808 4681 pod_startup_latency_tracker.go:104] "Observed 
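The exit codes above follow the usual Unix convention: 0 for the cleanly stopped ceilometer agents, 2 for sg-core's own error status, and (shortly below) 143 for nova-api-log, i.e. 128 + 15, a process terminated by the SIGTERM that the gracePeriod=30 kill delivers before any SIGKILL escalation. A small decoder sketch:

    import signal

    def describe_exit(code: int) -> str:
        if code == 0:
            return "clean exit"
        if code > 128:  # fatal signal: 128 + signal number
            try:
                return "killed by " + signal.Signals(code - 128).name
            except ValueError:
                return "killed by signal %d" % (code - 128)
        return "application error (status %d)" % code

    # describe_exit(143) -> 'killed by SIGTERM'
    # describe_exit(2)   -> 'application error (status 2)'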
pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" podStartSLOduration=3.543788833 podStartE2EDuration="3.543788833s" podCreationTimestamp="2025-10-07 17:25:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:25:28.538548487 +0000 UTC m=+1332.185960042" watchObservedRunningTime="2025-10-07 17:25:28.543788833 +0000 UTC m=+1332.191200378" Oct 07 17:25:29 crc kubenswrapper[4681]: I1007 17:25:29.528183 4681 generic.go:334] "Generic (PLEG): container finished" podID="b9a0c104-2595-4646-b20a-7f3975f6874b" containerID="ed1946e9da9f4558a9025bdada061830bb6ce7a05e1701b31e57778ef7f0e7f4" exitCode=143 Oct 07 17:25:29 crc kubenswrapper[4681]: I1007 17:25:29.528241 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b9a0c104-2595-4646-b20a-7f3975f6874b","Type":"ContainerDied","Data":"ed1946e9da9f4558a9025bdada061830bb6ce7a05e1701b31e57778ef7f0e7f4"} Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.115859 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.300294 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9a0c104-2595-4646-b20a-7f3975f6874b-config-data\") pod \"b9a0c104-2595-4646-b20a-7f3975f6874b\" (UID: \"b9a0c104-2595-4646-b20a-7f3975f6874b\") " Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.300361 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a0c104-2595-4646-b20a-7f3975f6874b-combined-ca-bundle\") pod \"b9a0c104-2595-4646-b20a-7f3975f6874b\" (UID: \"b9a0c104-2595-4646-b20a-7f3975f6874b\") " Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.300481 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7q6b6\" (UniqueName: \"kubernetes.io/projected/b9a0c104-2595-4646-b20a-7f3975f6874b-kube-api-access-7q6b6\") pod \"b9a0c104-2595-4646-b20a-7f3975f6874b\" (UID: \"b9a0c104-2595-4646-b20a-7f3975f6874b\") " Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.300575 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9a0c104-2595-4646-b20a-7f3975f6874b-logs\") pod \"b9a0c104-2595-4646-b20a-7f3975f6874b\" (UID: \"b9a0c104-2595-4646-b20a-7f3975f6874b\") " Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.301595 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9a0c104-2595-4646-b20a-7f3975f6874b-logs" (OuterVolumeSpecName: "logs") pod "b9a0c104-2595-4646-b20a-7f3975f6874b" (UID: "b9a0c104-2595-4646-b20a-7f3975f6874b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.316044 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9a0c104-2595-4646-b20a-7f3975f6874b-kube-api-access-7q6b6" (OuterVolumeSpecName: "kube-api-access-7q6b6") pod "b9a0c104-2595-4646-b20a-7f3975f6874b" (UID: "b9a0c104-2595-4646-b20a-7f3975f6874b"). InnerVolumeSpecName "kube-api-access-7q6b6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.337230 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9a0c104-2595-4646-b20a-7f3975f6874b-config-data" (OuterVolumeSpecName: "config-data") pod "b9a0c104-2595-4646-b20a-7f3975f6874b" (UID: "b9a0c104-2595-4646-b20a-7f3975f6874b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.351066 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9a0c104-2595-4646-b20a-7f3975f6874b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9a0c104-2595-4646-b20a-7f3975f6874b" (UID: "b9a0c104-2595-4646-b20a-7f3975f6874b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.403938 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9a0c104-2595-4646-b20a-7f3975f6874b-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.404170 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a0c104-2595-4646-b20a-7f3975f6874b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.404277 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7q6b6\" (UniqueName: \"kubernetes.io/projected/b9a0c104-2595-4646-b20a-7f3975f6874b-kube-api-access-7q6b6\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.404384 4681 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9a0c104-2595-4646-b20a-7f3975f6874b-logs\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.554049 4681 generic.go:334] "Generic (PLEG): container finished" podID="b9a0c104-2595-4646-b20a-7f3975f6874b" containerID="d092e68c2259e6805749b2e6d0fee7d59542d9d8e278153ac5c31b6f16a08ede" exitCode=0 Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.554309 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b9a0c104-2595-4646-b20a-7f3975f6874b","Type":"ContainerDied","Data":"d092e68c2259e6805749b2e6d0fee7d59542d9d8e278153ac5c31b6f16a08ede"} Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.554399 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b9a0c104-2595-4646-b20a-7f3975f6874b","Type":"ContainerDied","Data":"51fccbe9c0e3e89d1faccf68be0af5f0862ae1befc186360774c3c10b0d5ec9e"} Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.554477 4681 scope.go:117] "RemoveContainer" containerID="d092e68c2259e6805749b2e6d0fee7d59542d9d8e278153ac5c31b6f16a08ede" Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.554688 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.589743 4681 scope.go:117] "RemoveContainer" containerID="ed1946e9da9f4558a9025bdada061830bb6ce7a05e1701b31e57778ef7f0e7f4" Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.618014 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.634062 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.648318 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 17:25:32 crc kubenswrapper[4681]: E1007 17:25:32.648748 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9a0c104-2595-4646-b20a-7f3975f6874b" containerName="nova-api-log" Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.648764 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a0c104-2595-4646-b20a-7f3975f6874b" containerName="nova-api-log" Oct 07 17:25:32 crc kubenswrapper[4681]: E1007 17:25:32.648791 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9a0c104-2595-4646-b20a-7f3975f6874b" containerName="nova-api-api" Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.648797 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a0c104-2595-4646-b20a-7f3975f6874b" containerName="nova-api-api" Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.648997 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9a0c104-2595-4646-b20a-7f3975f6874b" containerName="nova-api-api" Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.649014 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9a0c104-2595-4646-b20a-7f3975f6874b" containerName="nova-api-log" Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.650012 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.652139 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.652307 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.652493 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.661445 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.671329 4681 scope.go:117] "RemoveContainer" containerID="d092e68c2259e6805749b2e6d0fee7d59542d9d8e278153ac5c31b6f16a08ede" Oct 07 17:25:32 crc kubenswrapper[4681]: E1007 17:25:32.672521 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d092e68c2259e6805749b2e6d0fee7d59542d9d8e278153ac5c31b6f16a08ede\": container with ID starting with d092e68c2259e6805749b2e6d0fee7d59542d9d8e278153ac5c31b6f16a08ede not found: ID does not exist" containerID="d092e68c2259e6805749b2e6d0fee7d59542d9d8e278153ac5c31b6f16a08ede" Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.672561 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d092e68c2259e6805749b2e6d0fee7d59542d9d8e278153ac5c31b6f16a08ede"} err="failed to get container status \"d092e68c2259e6805749b2e6d0fee7d59542d9d8e278153ac5c31b6f16a08ede\": rpc error: code = NotFound desc = could not find container \"d092e68c2259e6805749b2e6d0fee7d59542d9d8e278153ac5c31b6f16a08ede\": container with ID starting with d092e68c2259e6805749b2e6d0fee7d59542d9d8e278153ac5c31b6f16a08ede not found: ID does not exist" Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.672586 4681 scope.go:117] "RemoveContainer" containerID="ed1946e9da9f4558a9025bdada061830bb6ce7a05e1701b31e57778ef7f0e7f4" Oct 07 17:25:32 crc kubenswrapper[4681]: E1007 17:25:32.675491 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed1946e9da9f4558a9025bdada061830bb6ce7a05e1701b31e57778ef7f0e7f4\": container with ID starting with ed1946e9da9f4558a9025bdada061830bb6ce7a05e1701b31e57778ef7f0e7f4 not found: ID does not exist" containerID="ed1946e9da9f4558a9025bdada061830bb6ce7a05e1701b31e57778ef7f0e7f4" Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.675558 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed1946e9da9f4558a9025bdada061830bb6ce7a05e1701b31e57778ef7f0e7f4"} err="failed to get container status \"ed1946e9da9f4558a9025bdada061830bb6ce7a05e1701b31e57778ef7f0e7f4\": rpc error: code = NotFound desc = could not find container \"ed1946e9da9f4558a9025bdada061830bb6ce7a05e1701b31e57778ef7f0e7f4\": container with ID starting with ed1946e9da9f4558a9025bdada061830bb6ce7a05e1701b31e57778ef7f0e7f4 not found: ID does not exist" Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.811075 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2bee682-4d68-4edf-9596-e1041fd9a8b5-logs\") pod \"nova-api-0\" (UID: \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\") " pod="openstack/nova-api-0" Oct 07 17:25:32 crc 
Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.811424 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2bee682-4d68-4edf-9596-e1041fd9a8b5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\") " pod="openstack/nova-api-0"
Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.811495 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2bkc\" (UniqueName: \"kubernetes.io/projected/a2bee682-4d68-4edf-9596-e1041fd9a8b5-kube-api-access-q2bkc\") pod \"nova-api-0\" (UID: \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\") " pod="openstack/nova-api-0"
Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.811585 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2bee682-4d68-4edf-9596-e1041fd9a8b5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\") " pod="openstack/nova-api-0"
Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.811669 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2bee682-4d68-4edf-9596-e1041fd9a8b5-public-tls-certs\") pod \"nova-api-0\" (UID: \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\") " pod="openstack/nova-api-0"
Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.811817 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2bee682-4d68-4edf-9596-e1041fd9a8b5-config-data\") pod \"nova-api-0\" (UID: \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\") " pod="openstack/nova-api-0"
Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.915086 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2bee682-4d68-4edf-9596-e1041fd9a8b5-public-tls-certs\") pod \"nova-api-0\" (UID: \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\") " pod="openstack/nova-api-0"
Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.915161 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2bee682-4d68-4edf-9596-e1041fd9a8b5-config-data\") pod \"nova-api-0\" (UID: \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\") " pod="openstack/nova-api-0"
Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.915231 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2bee682-4d68-4edf-9596-e1041fd9a8b5-logs\") pod \"nova-api-0\" (UID: \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\") " pod="openstack/nova-api-0"
Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.915253 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2bee682-4d68-4edf-9596-e1041fd9a8b5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\") " pod="openstack/nova-api-0"
Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.915316 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2bkc\" (UniqueName: \"kubernetes.io/projected/a2bee682-4d68-4edf-9596-e1041fd9a8b5-kube-api-access-q2bkc\") pod \"nova-api-0\" (UID: \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\") " pod="openstack/nova-api-0"
Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.915400 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2bee682-4d68-4edf-9596-e1041fd9a8b5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\") " pod="openstack/nova-api-0"
Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.916615 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2bee682-4d68-4edf-9596-e1041fd9a8b5-logs\") pod \"nova-api-0\" (UID: \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\") " pod="openstack/nova-api-0"
Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.921471 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2bee682-4d68-4edf-9596-e1041fd9a8b5-public-tls-certs\") pod \"nova-api-0\" (UID: \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\") " pod="openstack/nova-api-0"
Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.921582 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2bee682-4d68-4edf-9596-e1041fd9a8b5-config-data\") pod \"nova-api-0\" (UID: \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\") " pod="openstack/nova-api-0"
Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.927725 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2bee682-4d68-4edf-9596-e1041fd9a8b5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\") " pod="openstack/nova-api-0"
Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.939530 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2bee682-4d68-4edf-9596-e1041fd9a8b5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\") " pod="openstack/nova-api-0"
Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.945930 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2bkc\" (UniqueName: \"kubernetes.io/projected/a2bee682-4d68-4edf-9596-e1041fd9a8b5-kube-api-access-q2bkc\") pod \"nova-api-0\" (UID: \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\") " pod="openstack/nova-api-0"
Oct 07 17:25:32 crc kubenswrapper[4681]: I1007 17:25:32.991746 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.039752 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9a0c104-2595-4646-b20a-7f3975f6874b" path="/var/lib/kubelet/pods/b9a0c104-2595-4646-b20a-7f3975f6874b/volumes"
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.118958 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.185106 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.219494 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8039090b-be20-41f5-8135-afb87372db43-run-httpd\") pod \"8039090b-be20-41f5-8135-afb87372db43\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.219589 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-ceilometer-tls-certs\") pod \"8039090b-be20-41f5-8135-afb87372db43\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.219616 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-scripts\") pod \"8039090b-be20-41f5-8135-afb87372db43\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.219637 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-sg-core-conf-yaml\") pod \"8039090b-be20-41f5-8135-afb87372db43\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.219725 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-config-data\") pod \"8039090b-be20-41f5-8135-afb87372db43\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.219840 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8039090b-be20-41f5-8135-afb87372db43-log-httpd\") pod \"8039090b-be20-41f5-8135-afb87372db43\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.219867 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m25xm\" (UniqueName: \"kubernetes.io/projected/8039090b-be20-41f5-8135-afb87372db43-kube-api-access-m25xm\") pod \"8039090b-be20-41f5-8135-afb87372db43\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.219984 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-combined-ca-bundle\") pod \"8039090b-be20-41f5-8135-afb87372db43\" (UID: \"8039090b-be20-41f5-8135-afb87372db43\") " Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.221002 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8039090b-be20-41f5-8135-afb87372db43-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8039090b-be20-41f5-8135-afb87372db43" (UID: 
"8039090b-be20-41f5-8135-afb87372db43"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.221371 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8039090b-be20-41f5-8135-afb87372db43-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8039090b-be20-41f5-8135-afb87372db43" (UID: "8039090b-be20-41f5-8135-afb87372db43"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.228705 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8039090b-be20-41f5-8135-afb87372db43-kube-api-access-m25xm" (OuterVolumeSpecName: "kube-api-access-m25xm") pod "8039090b-be20-41f5-8135-afb87372db43" (UID: "8039090b-be20-41f5-8135-afb87372db43"). InnerVolumeSpecName "kube-api-access-m25xm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.249541 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-scripts" (OuterVolumeSpecName: "scripts") pod "8039090b-be20-41f5-8135-afb87372db43" (UID: "8039090b-be20-41f5-8135-afb87372db43"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.265009 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8039090b-be20-41f5-8135-afb87372db43" (UID: "8039090b-be20-41f5-8135-afb87372db43"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.323033 4681 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8039090b-be20-41f5-8135-afb87372db43-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.323278 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m25xm\" (UniqueName: \"kubernetes.io/projected/8039090b-be20-41f5-8135-afb87372db43-kube-api-access-m25xm\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.323340 4681 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8039090b-be20-41f5-8135-afb87372db43-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.323430 4681 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.323491 4681 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.390113 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8039090b-be20-41f5-8135-afb87372db43" (UID: "8039090b-be20-41f5-8135-afb87372db43"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.393920 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8039090b-be20-41f5-8135-afb87372db43" (UID: "8039090b-be20-41f5-8135-afb87372db43"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.394643 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-config-data" (OuterVolumeSpecName: "config-data") pod "8039090b-be20-41f5-8135-afb87372db43" (UID: "8039090b-be20-41f5-8135-afb87372db43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.426380 4681 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.426407 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.426416 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8039090b-be20-41f5-8135-afb87372db43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.542170 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.575124 4681 generic.go:334] "Generic (PLEG): container finished" podID="8039090b-be20-41f5-8135-afb87372db43" containerID="6b6b5ef3678bcf8d7f017217dfee2b64e474d1bde1924a6708487049c4700264" exitCode=0 Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.575212 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.575181 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8039090b-be20-41f5-8135-afb87372db43","Type":"ContainerDied","Data":"6b6b5ef3678bcf8d7f017217dfee2b64e474d1bde1924a6708487049c4700264"} Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.575564 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8039090b-be20-41f5-8135-afb87372db43","Type":"ContainerDied","Data":"024e58a2c594e4f915c999192610867c46812b0711878ca70803bc684b2f717b"} Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.575591 4681 scope.go:117] "RemoveContainer" containerID="3227ac7d985aef34f42c90f5f24560179833ff14323eaabc6a78792879c0e455" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.582556 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a2bee682-4d68-4edf-9596-e1041fd9a8b5","Type":"ContainerStarted","Data":"7dee9b98a1c298b802bcbce3907f617f31032343af889dca4d32c9b8a4327e4d"} Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.605183 4681 scope.go:117] "RemoveContainer" containerID="60fff8663367fb62de4a85a1c0f9318700d74e2d56b75e39723c84de5fd85996" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.607474 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.610137 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.616921 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.651939 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 17:25:33 crc kubenswrapper[4681]: E1007 17:25:33.652612 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8039090b-be20-41f5-8135-afb87372db43" containerName="proxy-httpd" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.652633 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="8039090b-be20-41f5-8135-afb87372db43" containerName="proxy-httpd" Oct 07 17:25:33 crc kubenswrapper[4681]: E1007 17:25:33.652659 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8039090b-be20-41f5-8135-afb87372db43" containerName="sg-core" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.652667 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="8039090b-be20-41f5-8135-afb87372db43" containerName="sg-core" Oct 07 17:25:33 crc kubenswrapper[4681]: E1007 17:25:33.652710 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8039090b-be20-41f5-8135-afb87372db43" containerName="ceilometer-central-agent" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.652717 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="8039090b-be20-41f5-8135-afb87372db43" containerName="ceilometer-central-agent" Oct 07 17:25:33 crc kubenswrapper[4681]: E1007 17:25:33.652732 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8039090b-be20-41f5-8135-afb87372db43" containerName="ceilometer-notification-agent" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.652737 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="8039090b-be20-41f5-8135-afb87372db43" containerName="ceilometer-notification-agent" Oct 07 17:25:33 crc 
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.652982 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="8039090b-be20-41f5-8135-afb87372db43" containerName="sg-core"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.652996 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="8039090b-be20-41f5-8135-afb87372db43" containerName="proxy-httpd"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.653003 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="8039090b-be20-41f5-8135-afb87372db43" containerName="ceilometer-notification-agent"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.653016 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="8039090b-be20-41f5-8135-afb87372db43" containerName="ceilometer-central-agent"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.654788 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.658257 4681 scope.go:117] "RemoveContainer" containerID="6b6b5ef3678bcf8d7f017217dfee2b64e474d1bde1924a6708487049c4700264"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.658583 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.658648 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.658816 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.683762 4681 scope.go:117] "RemoveContainer" containerID="3d8909ae6f5acd852247904740cf548c36b8fd8ebb063450909fe225f6e0a307"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.695295 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.717830 4681 scope.go:117] "RemoveContainer" containerID="3227ac7d985aef34f42c90f5f24560179833ff14323eaabc6a78792879c0e455"
Oct 07 17:25:33 crc kubenswrapper[4681]: E1007 17:25:33.718471 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3227ac7d985aef34f42c90f5f24560179833ff14323eaabc6a78792879c0e455\": container with ID starting with 3227ac7d985aef34f42c90f5f24560179833ff14323eaabc6a78792879c0e455 not found: ID does not exist" containerID="3227ac7d985aef34f42c90f5f24560179833ff14323eaabc6a78792879c0e455"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.718518 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3227ac7d985aef34f42c90f5f24560179833ff14323eaabc6a78792879c0e455"} err="failed to get container status \"3227ac7d985aef34f42c90f5f24560179833ff14323eaabc6a78792879c0e455\": rpc error: code = NotFound desc = could not find container \"3227ac7d985aef34f42c90f5f24560179833ff14323eaabc6a78792879c0e455\": container with ID starting with 3227ac7d985aef34f42c90f5f24560179833ff14323eaabc6a78792879c0e455 not found: ID does not exist"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.718553 4681 scope.go:117] "RemoveContainer" containerID="60fff8663367fb62de4a85a1c0f9318700d74e2d56b75e39723c84de5fd85996"
Oct 07 17:25:33 crc kubenswrapper[4681]: E1007 17:25:33.718839 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60fff8663367fb62de4a85a1c0f9318700d74e2d56b75e39723c84de5fd85996\": container with ID starting with 60fff8663367fb62de4a85a1c0f9318700d74e2d56b75e39723c84de5fd85996 not found: ID does not exist" containerID="60fff8663367fb62de4a85a1c0f9318700d74e2d56b75e39723c84de5fd85996"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.718860 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60fff8663367fb62de4a85a1c0f9318700d74e2d56b75e39723c84de5fd85996"} err="failed to get container status \"60fff8663367fb62de4a85a1c0f9318700d74e2d56b75e39723c84de5fd85996\": rpc error: code = NotFound desc = could not find container \"60fff8663367fb62de4a85a1c0f9318700d74e2d56b75e39723c84de5fd85996\": container with ID starting with 60fff8663367fb62de4a85a1c0f9318700d74e2d56b75e39723c84de5fd85996 not found: ID does not exist"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.718903 4681 scope.go:117] "RemoveContainer" containerID="6b6b5ef3678bcf8d7f017217dfee2b64e474d1bde1924a6708487049c4700264"
Oct 07 17:25:33 crc kubenswrapper[4681]: E1007 17:25:33.719317 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b6b5ef3678bcf8d7f017217dfee2b64e474d1bde1924a6708487049c4700264\": container with ID starting with 6b6b5ef3678bcf8d7f017217dfee2b64e474d1bde1924a6708487049c4700264 not found: ID does not exist" containerID="6b6b5ef3678bcf8d7f017217dfee2b64e474d1bde1924a6708487049c4700264"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.719346 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b6b5ef3678bcf8d7f017217dfee2b64e474d1bde1924a6708487049c4700264"} err="failed to get container status \"6b6b5ef3678bcf8d7f017217dfee2b64e474d1bde1924a6708487049c4700264\": rpc error: code = NotFound desc = could not find container \"6b6b5ef3678bcf8d7f017217dfee2b64e474d1bde1924a6708487049c4700264\": container with ID starting with 6b6b5ef3678bcf8d7f017217dfee2b64e474d1bde1924a6708487049c4700264 not found: ID does not exist"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.719373 4681 scope.go:117] "RemoveContainer" containerID="3d8909ae6f5acd852247904740cf548c36b8fd8ebb063450909fe225f6e0a307"
Oct 07 17:25:33 crc kubenswrapper[4681]: E1007 17:25:33.719551 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d8909ae6f5acd852247904740cf548c36b8fd8ebb063450909fe225f6e0a307\": container with ID starting with 3d8909ae6f5acd852247904740cf548c36b8fd8ebb063450909fe225f6e0a307 not found: ID does not exist" containerID="3d8909ae6f5acd852247904740cf548c36b8fd8ebb063450909fe225f6e0a307"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.719567 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d8909ae6f5acd852247904740cf548c36b8fd8ebb063450909fe225f6e0a307"} err="failed to get container status \"3d8909ae6f5acd852247904740cf548c36b8fd8ebb063450909fe225f6e0a307\": rpc error: code = NotFound desc = could not find container \"3d8909ae6f5acd852247904740cf548c36b8fd8ebb063450909fe225f6e0a307\": container with ID starting with 3d8909ae6f5acd852247904740cf548c36b8fd8ebb063450909fe225f6e0a307 not found: ID does not exist"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.792052 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-fsn5z"]
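Taken together, the SyncLoop verbs trace each pod's lifecycle through this window: ADD and UPDATE for new objects, DELETE when the API object is marked for deletion, REMOVE once it is gone, and a fresh ADD for the replacement, as nova-api-0 and ceilometer-0 show above, with nova-cell1-cell-mapping-fsn5z arriving as a brand-new pod. A sketch that tallies the verbs per pod:

    import re
    from collections import Counter

    SYNC_RE = re.compile(r'"SyncLoop (ADD|UPDATE|DELETE|REMOVE)" '
                         r'source="api" pods=\["([^"]+)"\]')

    def syncloop_counts(log_text):
        # Counter({("openstack/nova-api-0", "ADD"): 1, ...})
        return Counter((pod, verb) for verb, pod in SYNC_RE.findall(log_text))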
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.793219 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fsn5z"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.798047 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.798327 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.821533 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-fsn5z"]
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.840008 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8863ad2-0fce-42cc-aae0-cd51fe7a79ab-config-data\") pod \"ceilometer-0\" (UID: \"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab\") " pod="openstack/ceilometer-0"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.840202 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8863ad2-0fce-42cc-aae0-cd51fe7a79ab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab\") " pod="openstack/ceilometer-0"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.840269 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8863ad2-0fce-42cc-aae0-cd51fe7a79ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab\") " pod="openstack/ceilometer-0"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.840316 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8863ad2-0fce-42cc-aae0-cd51fe7a79ab-run-httpd\") pod \"ceilometer-0\" (UID: \"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab\") " pod="openstack/ceilometer-0"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.840381 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4qb2\" (UniqueName: \"kubernetes.io/projected/c8863ad2-0fce-42cc-aae0-cd51fe7a79ab-kube-api-access-h4qb2\") pod \"ceilometer-0\" (UID: \"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab\") " pod="openstack/ceilometer-0"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.840446 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8863ad2-0fce-42cc-aae0-cd51fe7a79ab-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab\") " pod="openstack/ceilometer-0"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.840530 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8863ad2-0fce-42cc-aae0-cd51fe7a79ab-log-httpd\") pod \"ceilometer-0\" (UID: \"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab\") " pod="openstack/ceilometer-0"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.840570 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8863ad2-0fce-42cc-aae0-cd51fe7a79ab-scripts\") pod \"ceilometer-0\" (UID: \"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab\") " pod="openstack/ceilometer-0"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.943699 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8863ad2-0fce-42cc-aae0-cd51fe7a79ab-config-data\") pod \"ceilometer-0\" (UID: \"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab\") " pod="openstack/ceilometer-0"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.943780 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31d57f2b-808f-4924-806d-b88ea028039b-scripts\") pod \"nova-cell1-cell-mapping-fsn5z\" (UID: \"31d57f2b-808f-4924-806d-b88ea028039b\") " pod="openstack/nova-cell1-cell-mapping-fsn5z"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.943861 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8863ad2-0fce-42cc-aae0-cd51fe7a79ab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab\") " pod="openstack/ceilometer-0"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.944598 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8863ad2-0fce-42cc-aae0-cd51fe7a79ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab\") " pod="openstack/ceilometer-0"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.944677 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8863ad2-0fce-42cc-aae0-cd51fe7a79ab-run-httpd\") pod \"ceilometer-0\" (UID: \"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab\") " pod="openstack/ceilometer-0"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.944698 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31d57f2b-808f-4924-806d-b88ea028039b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fsn5z\" (UID: \"31d57f2b-808f-4924-806d-b88ea028039b\") " pod="openstack/nova-cell1-cell-mapping-fsn5z"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.944807 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31d57f2b-808f-4924-806d-b88ea028039b-config-data\") pod \"nova-cell1-cell-mapping-fsn5z\" (UID: \"31d57f2b-808f-4924-806d-b88ea028039b\") " pod="openstack/nova-cell1-cell-mapping-fsn5z"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.944843 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4qb2\" (UniqueName: \"kubernetes.io/projected/c8863ad2-0fce-42cc-aae0-cd51fe7a79ab-kube-api-access-h4qb2\") pod \"ceilometer-0\" (UID: \"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab\") " pod="openstack/ceilometer-0"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.945292 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzh2m\" (UniqueName: \"kubernetes.io/projected/31d57f2b-808f-4924-806d-b88ea028039b-kube-api-access-jzh2m\") pod \"nova-cell1-cell-mapping-fsn5z\" (UID: \"31d57f2b-808f-4924-806d-b88ea028039b\") " pod="openstack/nova-cell1-cell-mapping-fsn5z"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.945354 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8863ad2-0fce-42cc-aae0-cd51fe7a79ab-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab\") " pod="openstack/ceilometer-0"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.945477 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8863ad2-0fce-42cc-aae0-cd51fe7a79ab-log-httpd\") pod \"ceilometer-0\" (UID: \"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab\") " pod="openstack/ceilometer-0"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.945518 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8863ad2-0fce-42cc-aae0-cd51fe7a79ab-scripts\") pod \"ceilometer-0\" (UID: \"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab\") " pod="openstack/ceilometer-0"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.948483 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8863ad2-0fce-42cc-aae0-cd51fe7a79ab-config-data\") pod \"ceilometer-0\" (UID: \"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab\") " pod="openstack/ceilometer-0"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.949641 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8863ad2-0fce-42cc-aae0-cd51fe7a79ab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab\") " pod="openstack/ceilometer-0"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.950949 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8863ad2-0fce-42cc-aae0-cd51fe7a79ab-log-httpd\") pod \"ceilometer-0\" (UID: \"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab\") " pod="openstack/ceilometer-0"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.954312 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8863ad2-0fce-42cc-aae0-cd51fe7a79ab-scripts\") pod \"ceilometer-0\" (UID: \"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab\") " pod="openstack/ceilometer-0"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.955197 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8863ad2-0fce-42cc-aae0-cd51fe7a79ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab\") " pod="openstack/ceilometer-0"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.957491 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8863ad2-0fce-42cc-aae0-cd51fe7a79ab-run-httpd\") pod \"ceilometer-0\" (UID: \"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab\") " pod="openstack/ceilometer-0"
Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.958135 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8863ad2-0fce-42cc-aae0-cd51fe7a79ab-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab\") " pod="openstack/ceilometer-0"
Oct 07 17:25:33 crc kubenswrapper[4681]: 
I1007 17:25:33.975073 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4qb2\" (UniqueName: \"kubernetes.io/projected/c8863ad2-0fce-42cc-aae0-cd51fe7a79ab-kube-api-access-h4qb2\") pod \"ceilometer-0\" (UID: \"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab\") " pod="openstack/ceilometer-0" Oct 07 17:25:33 crc kubenswrapper[4681]: I1007 17:25:33.987983 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 17:25:34 crc kubenswrapper[4681]: I1007 17:25:34.047097 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31d57f2b-808f-4924-806d-b88ea028039b-scripts\") pod \"nova-cell1-cell-mapping-fsn5z\" (UID: \"31d57f2b-808f-4924-806d-b88ea028039b\") " pod="openstack/nova-cell1-cell-mapping-fsn5z" Oct 07 17:25:34 crc kubenswrapper[4681]: I1007 17:25:34.047212 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31d57f2b-808f-4924-806d-b88ea028039b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fsn5z\" (UID: \"31d57f2b-808f-4924-806d-b88ea028039b\") " pod="openstack/nova-cell1-cell-mapping-fsn5z" Oct 07 17:25:34 crc kubenswrapper[4681]: I1007 17:25:34.047271 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31d57f2b-808f-4924-806d-b88ea028039b-config-data\") pod \"nova-cell1-cell-mapping-fsn5z\" (UID: \"31d57f2b-808f-4924-806d-b88ea028039b\") " pod="openstack/nova-cell1-cell-mapping-fsn5z" Oct 07 17:25:34 crc kubenswrapper[4681]: I1007 17:25:34.047312 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzh2m\" (UniqueName: \"kubernetes.io/projected/31d57f2b-808f-4924-806d-b88ea028039b-kube-api-access-jzh2m\") pod \"nova-cell1-cell-mapping-fsn5z\" (UID: \"31d57f2b-808f-4924-806d-b88ea028039b\") " pod="openstack/nova-cell1-cell-mapping-fsn5z" Oct 07 17:25:34 crc kubenswrapper[4681]: I1007 17:25:34.051492 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31d57f2b-808f-4924-806d-b88ea028039b-scripts\") pod \"nova-cell1-cell-mapping-fsn5z\" (UID: \"31d57f2b-808f-4924-806d-b88ea028039b\") " pod="openstack/nova-cell1-cell-mapping-fsn5z" Oct 07 17:25:34 crc kubenswrapper[4681]: I1007 17:25:34.052703 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31d57f2b-808f-4924-806d-b88ea028039b-config-data\") pod \"nova-cell1-cell-mapping-fsn5z\" (UID: \"31d57f2b-808f-4924-806d-b88ea028039b\") " pod="openstack/nova-cell1-cell-mapping-fsn5z" Oct 07 17:25:34 crc kubenswrapper[4681]: I1007 17:25:34.053257 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31d57f2b-808f-4924-806d-b88ea028039b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fsn5z\" (UID: \"31d57f2b-808f-4924-806d-b88ea028039b\") " pod="openstack/nova-cell1-cell-mapping-fsn5z" Oct 07 17:25:34 crc kubenswrapper[4681]: I1007 17:25:34.063105 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzh2m\" (UniqueName: \"kubernetes.io/projected/31d57f2b-808f-4924-806d-b88ea028039b-kube-api-access-jzh2m\") pod \"nova-cell1-cell-mapping-fsn5z\" (UID: \"31d57f2b-808f-4924-806d-b88ea028039b\") " 
pod="openstack/nova-cell1-cell-mapping-fsn5z" Oct 07 17:25:34 crc kubenswrapper[4681]: I1007 17:25:34.164945 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fsn5z" Oct 07 17:25:34 crc kubenswrapper[4681]: W1007 17:25:34.504570 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8863ad2_0fce_42cc_aae0_cd51fe7a79ab.slice/crio-b3309eb43c2d0006b2638e976d21ba0523d35d7e9df14193be9952724eb7cd81 WatchSource:0}: Error finding container b3309eb43c2d0006b2638e976d21ba0523d35d7e9df14193be9952724eb7cd81: Status 404 returned error can't find the container with id b3309eb43c2d0006b2638e976d21ba0523d35d7e9df14193be9952724eb7cd81 Oct 07 17:25:34 crc kubenswrapper[4681]: I1007 17:25:34.513424 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 17:25:34 crc kubenswrapper[4681]: I1007 17:25:34.591698 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab","Type":"ContainerStarted","Data":"b3309eb43c2d0006b2638e976d21ba0523d35d7e9df14193be9952724eb7cd81"} Oct 07 17:25:34 crc kubenswrapper[4681]: I1007 17:25:34.594366 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a2bee682-4d68-4edf-9596-e1041fd9a8b5","Type":"ContainerStarted","Data":"5201b3ed329d4cda593231fb84805bab2bf694cb5ee97d938f028bace7bd2d83"} Oct 07 17:25:34 crc kubenswrapper[4681]: I1007 17:25:34.594405 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a2bee682-4d68-4edf-9596-e1041fd9a8b5","Type":"ContainerStarted","Data":"ee93a82a30041ccfdd5537cf0b8db1930cb1119c3cd94148eca1649f3c4505b3"} Oct 07 17:25:34 crc kubenswrapper[4681]: I1007 17:25:34.612217 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.6121979939999997 podStartE2EDuration="2.612197994s" podCreationTimestamp="2025-10-07 17:25:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:25:34.61205886 +0000 UTC m=+1338.259470415" watchObservedRunningTime="2025-10-07 17:25:34.612197994 +0000 UTC m=+1338.259609539" Oct 07 17:25:34 crc kubenswrapper[4681]: I1007 17:25:34.656358 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-fsn5z"] Oct 07 17:25:34 crc kubenswrapper[4681]: W1007 17:25:34.662852 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31d57f2b_808f_4924_806d_b88ea028039b.slice/crio-3f494abf64d2f4a5c32f85804291dfef3c33d162103709e1fab4009b68d12868 WatchSource:0}: Error finding container 3f494abf64d2f4a5c32f85804291dfef3c33d162103709e1fab4009b68d12868: Status 404 returned error can't find the container with id 3f494abf64d2f4a5c32f85804291dfef3c33d162103709e1fab4009b68d12868 Oct 07 17:25:35 crc kubenswrapper[4681]: I1007 17:25:35.042918 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8039090b-be20-41f5-8135-afb87372db43" path="/var/lib/kubelet/pods/8039090b-be20-41f5-8135-afb87372db43/volumes" Oct 07 17:25:35 crc kubenswrapper[4681]: I1007 17:25:35.605714 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab","Type":"ContainerStarted","Data":"01575ece4f26e2c0d4d9dab0484bd18f073d39afe2ba3794cc9dd1c99547e23a"} Oct 07 17:25:35 crc kubenswrapper[4681]: I1007 17:25:35.607052 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fsn5z" event={"ID":"31d57f2b-808f-4924-806d-b88ea028039b","Type":"ContainerStarted","Data":"a9a5c7dbc52eee72cedd56d516f7a7fd2ba3f37ec1281ad0681a0aea2bf8bd0f"} Oct 07 17:25:35 crc kubenswrapper[4681]: I1007 17:25:35.607175 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fsn5z" event={"ID":"31d57f2b-808f-4924-806d-b88ea028039b","Type":"ContainerStarted","Data":"3f494abf64d2f4a5c32f85804291dfef3c33d162103709e1fab4009b68d12868"} Oct 07 17:25:35 crc kubenswrapper[4681]: I1007 17:25:35.627556 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-fsn5z" podStartSLOduration=2.627538431 podStartE2EDuration="2.627538431s" podCreationTimestamp="2025-10-07 17:25:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:25:35.620382061 +0000 UTC m=+1339.267793616" watchObservedRunningTime="2025-10-07 17:25:35.627538431 +0000 UTC m=+1339.274949986" Oct 07 17:25:36 crc kubenswrapper[4681]: I1007 17:25:36.034701 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" Oct 07 17:25:36 crc kubenswrapper[4681]: I1007 17:25:36.114075 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-mcftl"] Oct 07 17:25:36 crc kubenswrapper[4681]: I1007 17:25:36.114318 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-mcftl" podUID="13cc700b-7284-4106-aa2c-3d83ef58b00a" containerName="dnsmasq-dns" containerID="cri-o://20cef2b951410633979fbe0d3d90b743a2f268a7f89f7f02c5efcb529693f4a8" gracePeriod=10 Oct 07 17:25:36 crc kubenswrapper[4681]: I1007 17:25:36.622627 4681 generic.go:334] "Generic (PLEG): container finished" podID="13cc700b-7284-4106-aa2c-3d83ef58b00a" containerID="20cef2b951410633979fbe0d3d90b743a2f268a7f89f7f02c5efcb529693f4a8" exitCode=0 Oct 07 17:25:36 crc kubenswrapper[4681]: I1007 17:25:36.622670 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-mcftl" event={"ID":"13cc700b-7284-4106-aa2c-3d83ef58b00a","Type":"ContainerDied","Data":"20cef2b951410633979fbe0d3d90b743a2f268a7f89f7f02c5efcb529693f4a8"} Oct 07 17:25:36 crc kubenswrapper[4681]: I1007 17:25:36.718565 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-mcftl" Oct 07 17:25:36 crc kubenswrapper[4681]: I1007 17:25:36.822099 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-ovsdbserver-nb\") pod \"13cc700b-7284-4106-aa2c-3d83ef58b00a\" (UID: \"13cc700b-7284-4106-aa2c-3d83ef58b00a\") " Oct 07 17:25:36 crc kubenswrapper[4681]: I1007 17:25:36.822216 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-dns-swift-storage-0\") pod \"13cc700b-7284-4106-aa2c-3d83ef58b00a\" (UID: \"13cc700b-7284-4106-aa2c-3d83ef58b00a\") " Oct 07 17:25:36 crc kubenswrapper[4681]: I1007 17:25:36.822295 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-config\") pod \"13cc700b-7284-4106-aa2c-3d83ef58b00a\" (UID: \"13cc700b-7284-4106-aa2c-3d83ef58b00a\") " Oct 07 17:25:36 crc kubenswrapper[4681]: I1007 17:25:36.822365 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-ovsdbserver-sb\") pod \"13cc700b-7284-4106-aa2c-3d83ef58b00a\" (UID: \"13cc700b-7284-4106-aa2c-3d83ef58b00a\") " Oct 07 17:25:36 crc kubenswrapper[4681]: I1007 17:25:36.822426 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfp9h\" (UniqueName: \"kubernetes.io/projected/13cc700b-7284-4106-aa2c-3d83ef58b00a-kube-api-access-lfp9h\") pod \"13cc700b-7284-4106-aa2c-3d83ef58b00a\" (UID: \"13cc700b-7284-4106-aa2c-3d83ef58b00a\") " Oct 07 17:25:36 crc kubenswrapper[4681]: I1007 17:25:36.822467 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-dns-svc\") pod \"13cc700b-7284-4106-aa2c-3d83ef58b00a\" (UID: \"13cc700b-7284-4106-aa2c-3d83ef58b00a\") " Oct 07 17:25:36 crc kubenswrapper[4681]: I1007 17:25:36.859283 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13cc700b-7284-4106-aa2c-3d83ef58b00a-kube-api-access-lfp9h" (OuterVolumeSpecName: "kube-api-access-lfp9h") pod "13cc700b-7284-4106-aa2c-3d83ef58b00a" (UID: "13cc700b-7284-4106-aa2c-3d83ef58b00a"). InnerVolumeSpecName "kube-api-access-lfp9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:25:36 crc kubenswrapper[4681]: I1007 17:25:36.901819 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "13cc700b-7284-4106-aa2c-3d83ef58b00a" (UID: "13cc700b-7284-4106-aa2c-3d83ef58b00a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:25:36 crc kubenswrapper[4681]: I1007 17:25:36.923344 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "13cc700b-7284-4106-aa2c-3d83ef58b00a" (UID: "13cc700b-7284-4106-aa2c-3d83ef58b00a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:25:36 crc kubenswrapper[4681]: I1007 17:25:36.942090 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfp9h\" (UniqueName: \"kubernetes.io/projected/13cc700b-7284-4106-aa2c-3d83ef58b00a-kube-api-access-lfp9h\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:36 crc kubenswrapper[4681]: I1007 17:25:36.942123 4681 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:36 crc kubenswrapper[4681]: I1007 17:25:36.942132 4681 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:36 crc kubenswrapper[4681]: I1007 17:25:36.973028 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-config" (OuterVolumeSpecName: "config") pod "13cc700b-7284-4106-aa2c-3d83ef58b00a" (UID: "13cc700b-7284-4106-aa2c-3d83ef58b00a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:25:36 crc kubenswrapper[4681]: I1007 17:25:36.991177 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "13cc700b-7284-4106-aa2c-3d83ef58b00a" (UID: "13cc700b-7284-4106-aa2c-3d83ef58b00a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:25:37 crc kubenswrapper[4681]: I1007 17:25:37.003942 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "13cc700b-7284-4106-aa2c-3d83ef58b00a" (UID: "13cc700b-7284-4106-aa2c-3d83ef58b00a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:25:37 crc kubenswrapper[4681]: I1007 17:25:37.043999 4681 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:37 crc kubenswrapper[4681]: I1007 17:25:37.044032 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:37 crc kubenswrapper[4681]: I1007 17:25:37.044046 4681 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13cc700b-7284-4106-aa2c-3d83ef58b00a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:37 crc kubenswrapper[4681]: I1007 17:25:37.440977 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-64677bd694-6xgb2" podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.142:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.142:8443: connect: connection refused" Oct 07 17:25:37 crc kubenswrapper[4681]: I1007 17:25:37.441334 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:25:37 crc kubenswrapper[4681]: I1007 17:25:37.634971 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-mcftl" event={"ID":"13cc700b-7284-4106-aa2c-3d83ef58b00a","Type":"ContainerDied","Data":"823764cd564594629f8b01391ac91eb972bec2648f1fad1ed6ed91aa682732e9"} Oct 07 17:25:37 crc kubenswrapper[4681]: I1007 17:25:37.635026 4681 scope.go:117] "RemoveContainer" containerID="20cef2b951410633979fbe0d3d90b743a2f268a7f89f7f02c5efcb529693f4a8" Oct 07 17:25:37 crc kubenswrapper[4681]: I1007 17:25:37.634995 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-mcftl" Oct 07 17:25:37 crc kubenswrapper[4681]: I1007 17:25:37.637074 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab","Type":"ContainerStarted","Data":"f63b0eb98b063f6b2966e7295b40be0640c5119bc214183cc20b42109392f764"} Oct 07 17:25:37 crc kubenswrapper[4681]: I1007 17:25:37.659276 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-mcftl"] Oct 07 17:25:37 crc kubenswrapper[4681]: I1007 17:25:37.668821 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-mcftl"] Oct 07 17:25:37 crc kubenswrapper[4681]: I1007 17:25:37.764492 4681 scope.go:117] "RemoveContainer" containerID="d240bf962a7b81b1551dfbe97931d983e97e348485ca8ce87aa45e1eb3589f9f" Oct 07 17:25:38 crc kubenswrapper[4681]: I1007 17:25:38.649818 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab","Type":"ContainerStarted","Data":"f61667760e9e9db7fbf5fd8f5f4e1e7f50a8f34135186cda1b50bf1d1a10093c"} Oct 07 17:25:39 crc kubenswrapper[4681]: I1007 17:25:39.039154 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13cc700b-7284-4106-aa2c-3d83ef58b00a" path="/var/lib/kubelet/pods/13cc700b-7284-4106-aa2c-3d83ef58b00a/volumes" Oct 07 17:25:39 crc kubenswrapper[4681]: I1007 17:25:39.663083 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8863ad2-0fce-42cc-aae0-cd51fe7a79ab","Type":"ContainerStarted","Data":"5c9f12b2334dc2701e414119ecd803cdfce9f5d24897043b9f321fd747e3c42f"} Oct 07 17:25:39 crc kubenswrapper[4681]: I1007 17:25:39.663583 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 17:25:39 crc kubenswrapper[4681]: I1007 17:25:39.686081 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.951325373 podStartE2EDuration="6.686058199s" podCreationTimestamp="2025-10-07 17:25:33 +0000 UTC" firstStartedPulling="2025-10-07 17:25:34.507306312 +0000 UTC m=+1338.154717867" lastFinishedPulling="2025-10-07 17:25:39.242039138 +0000 UTC m=+1342.889450693" observedRunningTime="2025-10-07 17:25:39.680923066 +0000 UTC m=+1343.328334641" watchObservedRunningTime="2025-10-07 17:25:39.686058199 +0000 UTC m=+1343.333469754" Oct 07 17:25:40 crc kubenswrapper[4681]: I1007 17:25:40.675935 4681 generic.go:334] "Generic (PLEG): container finished" podID="31d57f2b-808f-4924-806d-b88ea028039b" containerID="a9a5c7dbc52eee72cedd56d516f7a7fd2ba3f37ec1281ad0681a0aea2bf8bd0f" exitCode=0 Oct 07 17:25:40 crc kubenswrapper[4681]: I1007 17:25:40.676014 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fsn5z" event={"ID":"31d57f2b-808f-4924-806d-b88ea028039b","Type":"ContainerDied","Data":"a9a5c7dbc52eee72cedd56d516f7a7fd2ba3f37ec1281ad0681a0aea2bf8bd0f"} Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.071855 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fsn5z" Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.140584 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzh2m\" (UniqueName: \"kubernetes.io/projected/31d57f2b-808f-4924-806d-b88ea028039b-kube-api-access-jzh2m\") pod \"31d57f2b-808f-4924-806d-b88ea028039b\" (UID: \"31d57f2b-808f-4924-806d-b88ea028039b\") " Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.140742 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31d57f2b-808f-4924-806d-b88ea028039b-combined-ca-bundle\") pod \"31d57f2b-808f-4924-806d-b88ea028039b\" (UID: \"31d57f2b-808f-4924-806d-b88ea028039b\") " Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.140812 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31d57f2b-808f-4924-806d-b88ea028039b-config-data\") pod \"31d57f2b-808f-4924-806d-b88ea028039b\" (UID: \"31d57f2b-808f-4924-806d-b88ea028039b\") " Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.140858 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31d57f2b-808f-4924-806d-b88ea028039b-scripts\") pod \"31d57f2b-808f-4924-806d-b88ea028039b\" (UID: \"31d57f2b-808f-4924-806d-b88ea028039b\") " Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.162420 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d57f2b-808f-4924-806d-b88ea028039b-kube-api-access-jzh2m" (OuterVolumeSpecName: "kube-api-access-jzh2m") pod "31d57f2b-808f-4924-806d-b88ea028039b" (UID: "31d57f2b-808f-4924-806d-b88ea028039b"). InnerVolumeSpecName "kube-api-access-jzh2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.175947 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d57f2b-808f-4924-806d-b88ea028039b-scripts" (OuterVolumeSpecName: "scripts") pod "31d57f2b-808f-4924-806d-b88ea028039b" (UID: "31d57f2b-808f-4924-806d-b88ea028039b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.178671 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d57f2b-808f-4924-806d-b88ea028039b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31d57f2b-808f-4924-806d-b88ea028039b" (UID: "31d57f2b-808f-4924-806d-b88ea028039b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.194018 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d57f2b-808f-4924-806d-b88ea028039b-config-data" (OuterVolumeSpecName: "config-data") pod "31d57f2b-808f-4924-806d-b88ea028039b" (UID: "31d57f2b-808f-4924-806d-b88ea028039b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.243060 4681 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31d57f2b-808f-4924-806d-b88ea028039b-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.243094 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzh2m\" (UniqueName: \"kubernetes.io/projected/31d57f2b-808f-4924-806d-b88ea028039b-kube-api-access-jzh2m\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.243106 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31d57f2b-808f-4924-806d-b88ea028039b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.243115 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31d57f2b-808f-4924-806d-b88ea028039b-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.694341 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fsn5z" event={"ID":"31d57f2b-808f-4924-806d-b88ea028039b","Type":"ContainerDied","Data":"3f494abf64d2f4a5c32f85804291dfef3c33d162103709e1fab4009b68d12868"} Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.694924 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f494abf64d2f4a5c32f85804291dfef3c33d162103709e1fab4009b68d12868" Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.694597 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fsn5z" Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.701554 4681 generic.go:334] "Generic (PLEG): container finished" podID="990e1913-44d7-414b-a116-6b712547fc81" containerID="f12687af7e3841ca2a53c32deb4a7158e2c0c873f8ce45fbe4d823c0abd5a391" exitCode=137 Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.701603 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64677bd694-6xgb2" event={"ID":"990e1913-44d7-414b-a116-6b712547fc81","Type":"ContainerDied","Data":"f12687af7e3841ca2a53c32deb4a7158e2c0c873f8ce45fbe4d823c0abd5a391"} Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.765084 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:25:42 crc kubenswrapper[4681]: E1007 17:25:42.829358 4681 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31d57f2b_808f_4924_806d_b88ea028039b.slice\": RecentStats: unable to find data in memory cache]" Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.855992 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/990e1913-44d7-414b-a116-6b712547fc81-horizon-tls-certs\") pod \"990e1913-44d7-414b-a116-6b712547fc81\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.856171 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/990e1913-44d7-414b-a116-6b712547fc81-combined-ca-bundle\") pod \"990e1913-44d7-414b-a116-6b712547fc81\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.856214 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/990e1913-44d7-414b-a116-6b712547fc81-config-data\") pod \"990e1913-44d7-414b-a116-6b712547fc81\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.856235 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/990e1913-44d7-414b-a116-6b712547fc81-horizon-secret-key\") pod \"990e1913-44d7-414b-a116-6b712547fc81\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.856290 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/990e1913-44d7-414b-a116-6b712547fc81-scripts\") pod \"990e1913-44d7-414b-a116-6b712547fc81\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.856364 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/990e1913-44d7-414b-a116-6b712547fc81-logs\") pod \"990e1913-44d7-414b-a116-6b712547fc81\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.856491 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f84qg\" (UniqueName: \"kubernetes.io/projected/990e1913-44d7-414b-a116-6b712547fc81-kube-api-access-f84qg\") pod \"990e1913-44d7-414b-a116-6b712547fc81\" (UID: \"990e1913-44d7-414b-a116-6b712547fc81\") " Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.861991 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/990e1913-44d7-414b-a116-6b712547fc81-logs" (OuterVolumeSpecName: "logs") pod "990e1913-44d7-414b-a116-6b712547fc81" (UID: "990e1913-44d7-414b-a116-6b712547fc81"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.871353 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/990e1913-44d7-414b-a116-6b712547fc81-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "990e1913-44d7-414b-a116-6b712547fc81" (UID: "990e1913-44d7-414b-a116-6b712547fc81"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.871697 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/990e1913-44d7-414b-a116-6b712547fc81-kube-api-access-f84qg" (OuterVolumeSpecName: "kube-api-access-f84qg") pod "990e1913-44d7-414b-a116-6b712547fc81" (UID: "990e1913-44d7-414b-a116-6b712547fc81"). InnerVolumeSpecName "kube-api-access-f84qg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.911609 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.911819 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a2bee682-4d68-4edf-9596-e1041fd9a8b5" containerName="nova-api-log" containerID="cri-o://ee93a82a30041ccfdd5537cf0b8db1930cb1119c3cd94148eca1649f3c4505b3" gracePeriod=30 Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.911969 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a2bee682-4d68-4edf-9596-e1041fd9a8b5" containerName="nova-api-api" containerID="cri-o://5201b3ed329d4cda593231fb84805bab2bf694cb5ee97d938f028bace7bd2d83" gracePeriod=30 Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.924849 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/990e1913-44d7-414b-a116-6b712547fc81-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "990e1913-44d7-414b-a116-6b712547fc81" (UID: "990e1913-44d7-414b-a116-6b712547fc81"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.937902 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.938185 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="92a58956-0b94-4fcf-85a3-1d185f0e906f" containerName="nova-scheduler-scheduler" containerID="cri-o://e0691782af1fe751c4b9e8b818132858a96619ed4e805dc64adb28dc13d6989f" gracePeriod=30 Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.947338 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.947567 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c6957504-c035-489e-95d3-3cab2485c2b0" containerName="nova-metadata-log" containerID="cri-o://48b1afc9744caf6610a0da91855ce37fdf399ad2c93bca28a113f1e46a2a6872" gracePeriod=30 Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.947975 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c6957504-c035-489e-95d3-3cab2485c2b0" containerName="nova-metadata-metadata" containerID="cri-o://14d18da14598eb1d8e16120ac6d00143ca57ebece7658e61bb9bdb3fc0a34e06" gracePeriod=30 Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.959044 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f84qg\" (UniqueName: \"kubernetes.io/projected/990e1913-44d7-414b-a116-6b712547fc81-kube-api-access-f84qg\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.959080 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/990e1913-44d7-414b-a116-6b712547fc81-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.959090 4681 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/990e1913-44d7-414b-a116-6b712547fc81-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.959099 4681 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/990e1913-44d7-414b-a116-6b712547fc81-logs\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.969757 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/990e1913-44d7-414b-a116-6b712547fc81-config-data" (OuterVolumeSpecName: "config-data") pod "990e1913-44d7-414b-a116-6b712547fc81" (UID: "990e1913-44d7-414b-a116-6b712547fc81"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:25:42 crc kubenswrapper[4681]: I1007 17:25:42.981497 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/990e1913-44d7-414b-a116-6b712547fc81-scripts" (OuterVolumeSpecName: "scripts") pod "990e1913-44d7-414b-a116-6b712547fc81" (UID: "990e1913-44d7-414b-a116-6b712547fc81"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.006367 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/990e1913-44d7-414b-a116-6b712547fc81-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "990e1913-44d7-414b-a116-6b712547fc81" (UID: "990e1913-44d7-414b-a116-6b712547fc81"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.061476 4681 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/990e1913-44d7-414b-a116-6b712547fc81-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.061512 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/990e1913-44d7-414b-a116-6b712547fc81-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.061524 4681 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/990e1913-44d7-414b-a116-6b712547fc81-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.610534 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.674600 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2bkc\" (UniqueName: \"kubernetes.io/projected/a2bee682-4d68-4edf-9596-e1041fd9a8b5-kube-api-access-q2bkc\") pod \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\" (UID: \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\") " Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.674698 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2bee682-4d68-4edf-9596-e1041fd9a8b5-combined-ca-bundle\") pod \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\" (UID: \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\") " Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.674732 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2bee682-4d68-4edf-9596-e1041fd9a8b5-public-tls-certs\") pod \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\" (UID: \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\") " Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.674778 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2bee682-4d68-4edf-9596-e1041fd9a8b5-config-data\") pod \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\" (UID: \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\") " Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.674841 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2bee682-4d68-4edf-9596-e1041fd9a8b5-logs\") pod \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\" (UID: \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\") " Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.674986 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2bee682-4d68-4edf-9596-e1041fd9a8b5-internal-tls-certs\") pod \"a2bee682-4d68-4edf-9596-e1041fd9a8b5\" (UID: 
\"a2bee682-4d68-4edf-9596-e1041fd9a8b5\") " Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.675782 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2bee682-4d68-4edf-9596-e1041fd9a8b5-logs" (OuterVolumeSpecName: "logs") pod "a2bee682-4d68-4edf-9596-e1041fd9a8b5" (UID: "a2bee682-4d68-4edf-9596-e1041fd9a8b5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.678861 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2bee682-4d68-4edf-9596-e1041fd9a8b5-kube-api-access-q2bkc" (OuterVolumeSpecName: "kube-api-access-q2bkc") pod "a2bee682-4d68-4edf-9596-e1041fd9a8b5" (UID: "a2bee682-4d68-4edf-9596-e1041fd9a8b5"). InnerVolumeSpecName "kube-api-access-q2bkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.704814 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2bee682-4d68-4edf-9596-e1041fd9a8b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2bee682-4d68-4edf-9596-e1041fd9a8b5" (UID: "a2bee682-4d68-4edf-9596-e1041fd9a8b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.717596 4681 generic.go:334] "Generic (PLEG): container finished" podID="a2bee682-4d68-4edf-9596-e1041fd9a8b5" containerID="5201b3ed329d4cda593231fb84805bab2bf694cb5ee97d938f028bace7bd2d83" exitCode=0 Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.717631 4681 generic.go:334] "Generic (PLEG): container finished" podID="a2bee682-4d68-4edf-9596-e1041fd9a8b5" containerID="ee93a82a30041ccfdd5537cf0b8db1930cb1119c3cd94148eca1649f3c4505b3" exitCode=143 Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.717716 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a2bee682-4d68-4edf-9596-e1041fd9a8b5","Type":"ContainerDied","Data":"5201b3ed329d4cda593231fb84805bab2bf694cb5ee97d938f028bace7bd2d83"} Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.717703 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.717744 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a2bee682-4d68-4edf-9596-e1041fd9a8b5","Type":"ContainerDied","Data":"ee93a82a30041ccfdd5537cf0b8db1930cb1119c3cd94148eca1649f3c4505b3"} Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.717756 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a2bee682-4d68-4edf-9596-e1041fd9a8b5","Type":"ContainerDied","Data":"7dee9b98a1c298b802bcbce3907f617f31032343af889dca4d32c9b8a4327e4d"} Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.717771 4681 scope.go:117] "RemoveContainer" containerID="5201b3ed329d4cda593231fb84805bab2bf694cb5ee97d938f028bace7bd2d83" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.733512 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-64677bd694-6xgb2" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.734653 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64677bd694-6xgb2" event={"ID":"990e1913-44d7-414b-a116-6b712547fc81","Type":"ContainerDied","Data":"a4a63533711ff63ac127f62f10923e4573a91a5f48356a8d8ac2c8ae6c22c5bf"} Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.744569 4681 generic.go:334] "Generic (PLEG): container finished" podID="c6957504-c035-489e-95d3-3cab2485c2b0" containerID="48b1afc9744caf6610a0da91855ce37fdf399ad2c93bca28a113f1e46a2a6872" exitCode=143 Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.744608 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c6957504-c035-489e-95d3-3cab2485c2b0","Type":"ContainerDied","Data":"48b1afc9744caf6610a0da91855ce37fdf399ad2c93bca28a113f1e46a2a6872"} Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.744702 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2bee682-4d68-4edf-9596-e1041fd9a8b5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a2bee682-4d68-4edf-9596-e1041fd9a8b5" (UID: "a2bee682-4d68-4edf-9596-e1041fd9a8b5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.758409 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2bee682-4d68-4edf-9596-e1041fd9a8b5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a2bee682-4d68-4edf-9596-e1041fd9a8b5" (UID: "a2bee682-4d68-4edf-9596-e1041fd9a8b5"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.764114 4681 scope.go:117] "RemoveContainer" containerID="ee93a82a30041ccfdd5537cf0b8db1930cb1119c3cd94148eca1649f3c4505b3" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.764251 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64677bd694-6xgb2"] Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.781695 4681 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2bee682-4d68-4edf-9596-e1041fd9a8b5-logs\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.781923 4681 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2bee682-4d68-4edf-9596-e1041fd9a8b5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.781933 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2bkc\" (UniqueName: \"kubernetes.io/projected/a2bee682-4d68-4edf-9596-e1041fd9a8b5-kube-api-access-q2bkc\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.781942 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2bee682-4d68-4edf-9596-e1041fd9a8b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.781950 4681 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2bee682-4d68-4edf-9596-e1041fd9a8b5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 
17:25:43.784780 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-64677bd694-6xgb2"] Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.785141 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2bee682-4d68-4edf-9596-e1041fd9a8b5-config-data" (OuterVolumeSpecName: "config-data") pod "a2bee682-4d68-4edf-9596-e1041fd9a8b5" (UID: "a2bee682-4d68-4edf-9596-e1041fd9a8b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.800861 4681 scope.go:117] "RemoveContainer" containerID="5201b3ed329d4cda593231fb84805bab2bf694cb5ee97d938f028bace7bd2d83" Oct 07 17:25:43 crc kubenswrapper[4681]: E1007 17:25:43.801346 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5201b3ed329d4cda593231fb84805bab2bf694cb5ee97d938f028bace7bd2d83\": container with ID starting with 5201b3ed329d4cda593231fb84805bab2bf694cb5ee97d938f028bace7bd2d83 not found: ID does not exist" containerID="5201b3ed329d4cda593231fb84805bab2bf694cb5ee97d938f028bace7bd2d83" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.801378 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5201b3ed329d4cda593231fb84805bab2bf694cb5ee97d938f028bace7bd2d83"} err="failed to get container status \"5201b3ed329d4cda593231fb84805bab2bf694cb5ee97d938f028bace7bd2d83\": rpc error: code = NotFound desc = could not find container \"5201b3ed329d4cda593231fb84805bab2bf694cb5ee97d938f028bace7bd2d83\": container with ID starting with 5201b3ed329d4cda593231fb84805bab2bf694cb5ee97d938f028bace7bd2d83 not found: ID does not exist" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.801399 4681 scope.go:117] "RemoveContainer" containerID="ee93a82a30041ccfdd5537cf0b8db1930cb1119c3cd94148eca1649f3c4505b3" Oct 07 17:25:43 crc kubenswrapper[4681]: E1007 17:25:43.801820 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee93a82a30041ccfdd5537cf0b8db1930cb1119c3cd94148eca1649f3c4505b3\": container with ID starting with ee93a82a30041ccfdd5537cf0b8db1930cb1119c3cd94148eca1649f3c4505b3 not found: ID does not exist" containerID="ee93a82a30041ccfdd5537cf0b8db1930cb1119c3cd94148eca1649f3c4505b3" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.801852 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee93a82a30041ccfdd5537cf0b8db1930cb1119c3cd94148eca1649f3c4505b3"} err="failed to get container status \"ee93a82a30041ccfdd5537cf0b8db1930cb1119c3cd94148eca1649f3c4505b3\": rpc error: code = NotFound desc = could not find container \"ee93a82a30041ccfdd5537cf0b8db1930cb1119c3cd94148eca1649f3c4505b3\": container with ID starting with ee93a82a30041ccfdd5537cf0b8db1930cb1119c3cd94148eca1649f3c4505b3 not found: ID does not exist" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.801871 4681 scope.go:117] "RemoveContainer" containerID="5201b3ed329d4cda593231fb84805bab2bf694cb5ee97d938f028bace7bd2d83" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.803098 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5201b3ed329d4cda593231fb84805bab2bf694cb5ee97d938f028bace7bd2d83"} err="failed to get container status \"5201b3ed329d4cda593231fb84805bab2bf694cb5ee97d938f028bace7bd2d83\": rpc error: code = 
NotFound desc = could not find container \"5201b3ed329d4cda593231fb84805bab2bf694cb5ee97d938f028bace7bd2d83\": container with ID starting with 5201b3ed329d4cda593231fb84805bab2bf694cb5ee97d938f028bace7bd2d83 not found: ID does not exist" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.803119 4681 scope.go:117] "RemoveContainer" containerID="ee93a82a30041ccfdd5537cf0b8db1930cb1119c3cd94148eca1649f3c4505b3" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.803587 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee93a82a30041ccfdd5537cf0b8db1930cb1119c3cd94148eca1649f3c4505b3"} err="failed to get container status \"ee93a82a30041ccfdd5537cf0b8db1930cb1119c3cd94148eca1649f3c4505b3\": rpc error: code = NotFound desc = could not find container \"ee93a82a30041ccfdd5537cf0b8db1930cb1119c3cd94148eca1649f3c4505b3\": container with ID starting with ee93a82a30041ccfdd5537cf0b8db1930cb1119c3cd94148eca1649f3c4505b3 not found: ID does not exist" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.803605 4681 scope.go:117] "RemoveContainer" containerID="b4889e462f03c208394a02d8c27c149d4669b02ad5367737278bbdc6137dfbb3" Oct 07 17:25:43 crc kubenswrapper[4681]: I1007 17:25:43.884418 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2bee682-4d68-4edf-9596-e1041fd9a8b5-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.016599 4681 scope.go:117] "RemoveContainer" containerID="f12687af7e3841ca2a53c32deb4a7158e2c0c873f8ce45fbe4d823c0abd5a391" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.123516 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.133529 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.160979 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 17:25:44 crc kubenswrapper[4681]: E1007 17:25:44.161338 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.161354 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon" Oct 07 17:25:44 crc kubenswrapper[4681]: E1007 17:25:44.161376 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2bee682-4d68-4edf-9596-e1041fd9a8b5" containerName="nova-api-log" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.161382 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2bee682-4d68-4edf-9596-e1041fd9a8b5" containerName="nova-api-log" Oct 07 17:25:44 crc kubenswrapper[4681]: E1007 17:25:44.161399 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.161406 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon" Oct 07 17:25:44 crc kubenswrapper[4681]: E1007 17:25:44.161417 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2bee682-4d68-4edf-9596-e1041fd9a8b5" containerName="nova-api-api" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.161422 4681 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a2bee682-4d68-4edf-9596-e1041fd9a8b5" containerName="nova-api-api" Oct 07 17:25:44 crc kubenswrapper[4681]: E1007 17:25:44.161435 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon-log" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.161441 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon-log" Oct 07 17:25:44 crc kubenswrapper[4681]: E1007 17:25:44.161452 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13cc700b-7284-4106-aa2c-3d83ef58b00a" containerName="init" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.161457 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="13cc700b-7284-4106-aa2c-3d83ef58b00a" containerName="init" Oct 07 17:25:44 crc kubenswrapper[4681]: E1007 17:25:44.161472 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13cc700b-7284-4106-aa2c-3d83ef58b00a" containerName="dnsmasq-dns" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.161477 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="13cc700b-7284-4106-aa2c-3d83ef58b00a" containerName="dnsmasq-dns" Oct 07 17:25:44 crc kubenswrapper[4681]: E1007 17:25:44.161494 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31d57f2b-808f-4924-806d-b88ea028039b" containerName="nova-manage" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.161500 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="31d57f2b-808f-4924-806d-b88ea028039b" containerName="nova-manage" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.161665 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.161681 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="31d57f2b-808f-4924-806d-b88ea028039b" containerName="nova-manage" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.161690 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2bee682-4d68-4edf-9596-e1041fd9a8b5" containerName="nova-api-api" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.161699 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon-log" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.161709 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.161717 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.161725 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="13cc700b-7284-4106-aa2c-3d83ef58b00a" containerName="dnsmasq-dns" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.161739 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2bee682-4d68-4edf-9596-e1041fd9a8b5" containerName="nova-api-log" Oct 07 17:25:44 crc kubenswrapper[4681]: E1007 17:25:44.161921 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.161928 4681 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="990e1913-44d7-414b-a116-6b712547fc81" containerName="horizon" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.162773 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.167326 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.167620 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.167768 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.185544 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.196291 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tf8d\" (UniqueName: \"kubernetes.io/projected/9241da9a-f1bd-4d93-bd72-f84e5dd85083-kube-api-access-6tf8d\") pod \"nova-api-0\" (UID: \"9241da9a-f1bd-4d93-bd72-f84e5dd85083\") " pod="openstack/nova-api-0" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.196382 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9241da9a-f1bd-4d93-bd72-f84e5dd85083-config-data\") pod \"nova-api-0\" (UID: \"9241da9a-f1bd-4d93-bd72-f84e5dd85083\") " pod="openstack/nova-api-0" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.196458 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9241da9a-f1bd-4d93-bd72-f84e5dd85083-logs\") pod \"nova-api-0\" (UID: \"9241da9a-f1bd-4d93-bd72-f84e5dd85083\") " pod="openstack/nova-api-0" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.196515 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9241da9a-f1bd-4d93-bd72-f84e5dd85083-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9241da9a-f1bd-4d93-bd72-f84e5dd85083\") " pod="openstack/nova-api-0" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.196544 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9241da9a-f1bd-4d93-bd72-f84e5dd85083-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9241da9a-f1bd-4d93-bd72-f84e5dd85083\") " pod="openstack/nova-api-0" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.196577 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9241da9a-f1bd-4d93-bd72-f84e5dd85083-public-tls-certs\") pod \"nova-api-0\" (UID: \"9241da9a-f1bd-4d93-bd72-f84e5dd85083\") " pod="openstack/nova-api-0" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.298577 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9241da9a-f1bd-4d93-bd72-f84e5dd85083-logs\") pod \"nova-api-0\" (UID: \"9241da9a-f1bd-4d93-bd72-f84e5dd85083\") " pod="openstack/nova-api-0" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.298660 4681 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9241da9a-f1bd-4d93-bd72-f84e5dd85083-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9241da9a-f1bd-4d93-bd72-f84e5dd85083\") " pod="openstack/nova-api-0" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.298692 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9241da9a-f1bd-4d93-bd72-f84e5dd85083-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9241da9a-f1bd-4d93-bd72-f84e5dd85083\") " pod="openstack/nova-api-0" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.298718 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9241da9a-f1bd-4d93-bd72-f84e5dd85083-public-tls-certs\") pod \"nova-api-0\" (UID: \"9241da9a-f1bd-4d93-bd72-f84e5dd85083\") " pod="openstack/nova-api-0" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.298817 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tf8d\" (UniqueName: \"kubernetes.io/projected/9241da9a-f1bd-4d93-bd72-f84e5dd85083-kube-api-access-6tf8d\") pod \"nova-api-0\" (UID: \"9241da9a-f1bd-4d93-bd72-f84e5dd85083\") " pod="openstack/nova-api-0" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.298860 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9241da9a-f1bd-4d93-bd72-f84e5dd85083-config-data\") pod \"nova-api-0\" (UID: \"9241da9a-f1bd-4d93-bd72-f84e5dd85083\") " pod="openstack/nova-api-0" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.299974 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9241da9a-f1bd-4d93-bd72-f84e5dd85083-logs\") pod \"nova-api-0\" (UID: \"9241da9a-f1bd-4d93-bd72-f84e5dd85083\") " pod="openstack/nova-api-0" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.304471 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9241da9a-f1bd-4d93-bd72-f84e5dd85083-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9241da9a-f1bd-4d93-bd72-f84e5dd85083\") " pod="openstack/nova-api-0" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.305670 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9241da9a-f1bd-4d93-bd72-f84e5dd85083-public-tls-certs\") pod \"nova-api-0\" (UID: \"9241da9a-f1bd-4d93-bd72-f84e5dd85083\") " pod="openstack/nova-api-0" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.309716 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9241da9a-f1bd-4d93-bd72-f84e5dd85083-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9241da9a-f1bd-4d93-bd72-f84e5dd85083\") " pod="openstack/nova-api-0" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.313708 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9241da9a-f1bd-4d93-bd72-f84e5dd85083-config-data\") pod \"nova-api-0\" (UID: \"9241da9a-f1bd-4d93-bd72-f84e5dd85083\") " pod="openstack/nova-api-0" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.320122 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6tf8d\" (UniqueName: \"kubernetes.io/projected/9241da9a-f1bd-4d93-bd72-f84e5dd85083-kube-api-access-6tf8d\") pod \"nova-api-0\" (UID: \"9241da9a-f1bd-4d93-bd72-f84e5dd85083\") " pod="openstack/nova-api-0" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.481525 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.790189 4681 generic.go:334] "Generic (PLEG): container finished" podID="92a58956-0b94-4fcf-85a3-1d185f0e906f" containerID="e0691782af1fe751c4b9e8b818132858a96619ed4e805dc64adb28dc13d6989f" exitCode=0 Oct 07 17:25:44 crc kubenswrapper[4681]: I1007 17:25:44.790238 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"92a58956-0b94-4fcf-85a3-1d185f0e906f","Type":"ContainerDied","Data":"e0691782af1fe751c4b9e8b818132858a96619ed4e805dc64adb28dc13d6989f"} Oct 07 17:25:45 crc kubenswrapper[4681]: I1007 17:25:44.927396 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 17:25:45 crc kubenswrapper[4681]: I1007 17:25:45.046529 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="990e1913-44d7-414b-a116-6b712547fc81" path="/var/lib/kubelet/pods/990e1913-44d7-414b-a116-6b712547fc81/volumes" Oct 07 17:25:45 crc kubenswrapper[4681]: I1007 17:25:45.047188 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2bee682-4d68-4edf-9596-e1041fd9a8b5" path="/var/lib/kubelet/pods/a2bee682-4d68-4edf-9596-e1041fd9a8b5/volumes" Oct 07 17:25:45 crc kubenswrapper[4681]: E1007 17:25:45.707469 4681 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e0691782af1fe751c4b9e8b818132858a96619ed4e805dc64adb28dc13d6989f is running failed: container process not found" containerID="e0691782af1fe751c4b9e8b818132858a96619ed4e805dc64adb28dc13d6989f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 17:25:45 crc kubenswrapper[4681]: E1007 17:25:45.708175 4681 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e0691782af1fe751c4b9e8b818132858a96619ed4e805dc64adb28dc13d6989f is running failed: container process not found" containerID="e0691782af1fe751c4b9e8b818132858a96619ed4e805dc64adb28dc13d6989f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 17:25:45 crc kubenswrapper[4681]: E1007 17:25:45.708584 4681 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e0691782af1fe751c4b9e8b818132858a96619ed4e805dc64adb28dc13d6989f is running failed: container process not found" containerID="e0691782af1fe751c4b9e8b818132858a96619ed4e805dc64adb28dc13d6989f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 17:25:45 crc kubenswrapper[4681]: E1007 17:25:45.708620 4681 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e0691782af1fe751c4b9e8b818132858a96619ed4e805dc64adb28dc13d6989f is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="92a58956-0b94-4fcf-85a3-1d185f0e906f" containerName="nova-scheduler-scheduler" Oct 07 17:25:45 crc kubenswrapper[4681]: I1007 17:25:45.801573 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"9241da9a-f1bd-4d93-bd72-f84e5dd85083","Type":"ContainerStarted","Data":"52951433d15e9ddb97d41ffc76ed14805a2eeea542ac80abd2f4bf9d4c68d20f"} Oct 07 17:25:45 crc kubenswrapper[4681]: I1007 17:25:45.801615 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9241da9a-f1bd-4d93-bd72-f84e5dd85083","Type":"ContainerStarted","Data":"e82bf88fef7bb29df02729c570f6734ae485897ee3abf6fe9a64344473b151ad"} Oct 07 17:25:45 crc kubenswrapper[4681]: I1007 17:25:45.801625 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9241da9a-f1bd-4d93-bd72-f84e5dd85083","Type":"ContainerStarted","Data":"bbdb1b96d6c6c6ddd66dbb5266c03b490636f58ee08de9111424e5ba88243eb7"} Oct 07 17:25:45 crc kubenswrapper[4681]: I1007 17:25:45.838805 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.838787769 podStartE2EDuration="1.838787769s" podCreationTimestamp="2025-10-07 17:25:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:25:45.826656781 +0000 UTC m=+1349.474068356" watchObservedRunningTime="2025-10-07 17:25:45.838787769 +0000 UTC m=+1349.486199324" Oct 07 17:25:45 crc kubenswrapper[4681]: I1007 17:25:45.847496 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 17:25:45 crc kubenswrapper[4681]: I1007 17:25:45.935152 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvz9c\" (UniqueName: \"kubernetes.io/projected/92a58956-0b94-4fcf-85a3-1d185f0e906f-kube-api-access-wvz9c\") pod \"92a58956-0b94-4fcf-85a3-1d185f0e906f\" (UID: \"92a58956-0b94-4fcf-85a3-1d185f0e906f\") " Oct 07 17:25:45 crc kubenswrapper[4681]: I1007 17:25:45.935216 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a58956-0b94-4fcf-85a3-1d185f0e906f-config-data\") pod \"92a58956-0b94-4fcf-85a3-1d185f0e906f\" (UID: \"92a58956-0b94-4fcf-85a3-1d185f0e906f\") " Oct 07 17:25:45 crc kubenswrapper[4681]: I1007 17:25:45.935289 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a58956-0b94-4fcf-85a3-1d185f0e906f-combined-ca-bundle\") pod \"92a58956-0b94-4fcf-85a3-1d185f0e906f\" (UID: \"92a58956-0b94-4fcf-85a3-1d185f0e906f\") " Oct 07 17:25:45 crc kubenswrapper[4681]: I1007 17:25:45.945087 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92a58956-0b94-4fcf-85a3-1d185f0e906f-kube-api-access-wvz9c" (OuterVolumeSpecName: "kube-api-access-wvz9c") pod "92a58956-0b94-4fcf-85a3-1d185f0e906f" (UID: "92a58956-0b94-4fcf-85a3-1d185f0e906f"). InnerVolumeSpecName "kube-api-access-wvz9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:25:45 crc kubenswrapper[4681]: I1007 17:25:45.977950 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a58956-0b94-4fcf-85a3-1d185f0e906f-config-data" (OuterVolumeSpecName: "config-data") pod "92a58956-0b94-4fcf-85a3-1d185f0e906f" (UID: "92a58956-0b94-4fcf-85a3-1d185f0e906f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:45 crc kubenswrapper[4681]: I1007 17:25:45.996348 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a58956-0b94-4fcf-85a3-1d185f0e906f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92a58956-0b94-4fcf-85a3-1d185f0e906f" (UID: "92a58956-0b94-4fcf-85a3-1d185f0e906f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.038027 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvz9c\" (UniqueName: \"kubernetes.io/projected/92a58956-0b94-4fcf-85a3-1d185f0e906f-kube-api-access-wvz9c\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.038062 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a58956-0b94-4fcf-85a3-1d185f0e906f-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.038077 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a58956-0b94-4fcf-85a3-1d185f0e906f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.624444 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.648741 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k2cg\" (UniqueName: \"kubernetes.io/projected/c6957504-c035-489e-95d3-3cab2485c2b0-kube-api-access-9k2cg\") pod \"c6957504-c035-489e-95d3-3cab2485c2b0\" (UID: \"c6957504-c035-489e-95d3-3cab2485c2b0\") " Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.649194 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6957504-c035-489e-95d3-3cab2485c2b0-config-data\") pod \"c6957504-c035-489e-95d3-3cab2485c2b0\" (UID: \"c6957504-c035-489e-95d3-3cab2485c2b0\") " Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.649319 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6957504-c035-489e-95d3-3cab2485c2b0-logs\") pod \"c6957504-c035-489e-95d3-3cab2485c2b0\" (UID: \"c6957504-c035-489e-95d3-3cab2485c2b0\") " Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.649350 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6957504-c035-489e-95d3-3cab2485c2b0-nova-metadata-tls-certs\") pod \"c6957504-c035-489e-95d3-3cab2485c2b0\" (UID: \"c6957504-c035-489e-95d3-3cab2485c2b0\") " Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.649377 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6957504-c035-489e-95d3-3cab2485c2b0-combined-ca-bundle\") pod \"c6957504-c035-489e-95d3-3cab2485c2b0\" (UID: \"c6957504-c035-489e-95d3-3cab2485c2b0\") " Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.649935 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6957504-c035-489e-95d3-3cab2485c2b0-logs" (OuterVolumeSpecName: "logs") pod "c6957504-c035-489e-95d3-3cab2485c2b0" (UID: 
"c6957504-c035-489e-95d3-3cab2485c2b0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.668707 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6957504-c035-489e-95d3-3cab2485c2b0-kube-api-access-9k2cg" (OuterVolumeSpecName: "kube-api-access-9k2cg") pod "c6957504-c035-489e-95d3-3cab2485c2b0" (UID: "c6957504-c035-489e-95d3-3cab2485c2b0"). InnerVolumeSpecName "kube-api-access-9k2cg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.732303 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6957504-c035-489e-95d3-3cab2485c2b0-config-data" (OuterVolumeSpecName: "config-data") pod "c6957504-c035-489e-95d3-3cab2485c2b0" (UID: "c6957504-c035-489e-95d3-3cab2485c2b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.742671 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6957504-c035-489e-95d3-3cab2485c2b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6957504-c035-489e-95d3-3cab2485c2b0" (UID: "c6957504-c035-489e-95d3-3cab2485c2b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.751504 4681 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6957504-c035-489e-95d3-3cab2485c2b0-logs\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.751544 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6957504-c035-489e-95d3-3cab2485c2b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.751558 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9k2cg\" (UniqueName: \"kubernetes.io/projected/c6957504-c035-489e-95d3-3cab2485c2b0-kube-api-access-9k2cg\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.751569 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6957504-c035-489e-95d3-3cab2485c2b0-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.774108 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6957504-c035-489e-95d3-3cab2485c2b0-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c6957504-c035-489e-95d3-3cab2485c2b0" (UID: "c6957504-c035-489e-95d3-3cab2485c2b0"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.817701 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"92a58956-0b94-4fcf-85a3-1d185f0e906f","Type":"ContainerDied","Data":"b6ccb56153ff7cc2c060aeedfe4eb15fcb101ba8c3f304424c1ffc38bc3101bb"} Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.817749 4681 scope.go:117] "RemoveContainer" containerID="e0691782af1fe751c4b9e8b818132858a96619ed4e805dc64adb28dc13d6989f" Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.817837 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.832364 4681 generic.go:334] "Generic (PLEG): container finished" podID="c6957504-c035-489e-95d3-3cab2485c2b0" containerID="14d18da14598eb1d8e16120ac6d00143ca57ebece7658e61bb9bdb3fc0a34e06" exitCode=0 Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.833375 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c6957504-c035-489e-95d3-3cab2485c2b0","Type":"ContainerDied","Data":"14d18da14598eb1d8e16120ac6d00143ca57ebece7658e61bb9bdb3fc0a34e06"} Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.833428 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c6957504-c035-489e-95d3-3cab2485c2b0","Type":"ContainerDied","Data":"6641f546e2c444e5770aa63c51ecb67fbb69b8fb6e823e426acba700f3138f4f"} Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.833383 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.854573 4681 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6957504-c035-489e-95d3-3cab2485c2b0-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.879894 4681 scope.go:117] "RemoveContainer" containerID="14d18da14598eb1d8e16120ac6d00143ca57ebece7658e61bb9bdb3fc0a34e06" Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.903768 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.929989 4681 scope.go:117] "RemoveContainer" containerID="48b1afc9744caf6610a0da91855ce37fdf399ad2c93bca28a113f1e46a2a6872" Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.930693 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.949678 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.962026 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.971933 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 17:25:46 crc kubenswrapper[4681]: E1007 17:25:46.972486 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6957504-c035-489e-95d3-3cab2485c2b0" containerName="nova-metadata-log" Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.972503 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6957504-c035-489e-95d3-3cab2485c2b0" containerName="nova-metadata-log" Oct 07 17:25:46 
Oct 07 17:25:46 crc kubenswrapper[4681]: E1007 17:25:46.972522 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a58956-0b94-4fcf-85a3-1d185f0e906f" containerName="nova-scheduler-scheduler"
Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.972530 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a58956-0b94-4fcf-85a3-1d185f0e906f" containerName="nova-scheduler-scheduler"
Oct 07 17:25:46 crc kubenswrapper[4681]: E1007 17:25:46.972555 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6957504-c035-489e-95d3-3cab2485c2b0" containerName="nova-metadata-metadata"
Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.972563 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6957504-c035-489e-95d3-3cab2485c2b0" containerName="nova-metadata-metadata"
Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.972809 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6957504-c035-489e-95d3-3cab2485c2b0" containerName="nova-metadata-log"
Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.972837 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="92a58956-0b94-4fcf-85a3-1d185f0e906f" containerName="nova-scheduler-scheduler"
Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.972851 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6957504-c035-489e-95d3-3cab2485c2b0" containerName="nova-metadata-metadata"
Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.973617 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.981177 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.983824 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.988956 4681 scope.go:117] "RemoveContainer" containerID="14d18da14598eb1d8e16120ac6d00143ca57ebece7658e61bb9bdb3fc0a34e06"
Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.991371 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 07 17:25:46 crc kubenswrapper[4681]: E1007 17:25:46.997137 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14d18da14598eb1d8e16120ac6d00143ca57ebece7658e61bb9bdb3fc0a34e06\": container with ID starting with 14d18da14598eb1d8e16120ac6d00143ca57ebece7658e61bb9bdb3fc0a34e06 not found: ID does not exist" containerID="14d18da14598eb1d8e16120ac6d00143ca57ebece7658e61bb9bdb3fc0a34e06"
Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.997194 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14d18da14598eb1d8e16120ac6d00143ca57ebece7658e61bb9bdb3fc0a34e06"} err="failed to get container status \"14d18da14598eb1d8e16120ac6d00143ca57ebece7658e61bb9bdb3fc0a34e06\": rpc error: code = NotFound desc = could not find container \"14d18da14598eb1d8e16120ac6d00143ca57ebece7658e61bb9bdb3fc0a34e06\": container with ID starting with 14d18da14598eb1d8e16120ac6d00143ca57ebece7658e61bb9bdb3fc0a34e06 not found: ID does not exist"
Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.997225 4681 scope.go:117] "RemoveContainer" containerID="48b1afc9744caf6610a0da91855ce37fdf399ad2c93bca28a113f1e46a2a6872"
Oct 07 17:25:46 crc kubenswrapper[4681]: I1007
17:25:46.997848 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 17:25:46 crc kubenswrapper[4681]: E1007 17:25:46.997868 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48b1afc9744caf6610a0da91855ce37fdf399ad2c93bca28a113f1e46a2a6872\": container with ID starting with 48b1afc9744caf6610a0da91855ce37fdf399ad2c93bca28a113f1e46a2a6872 not found: ID does not exist" containerID="48b1afc9744caf6610a0da91855ce37fdf399ad2c93bca28a113f1e46a2a6872" Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.997923 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48b1afc9744caf6610a0da91855ce37fdf399ad2c93bca28a113f1e46a2a6872"} err="failed to get container status \"48b1afc9744caf6610a0da91855ce37fdf399ad2c93bca28a113f1e46a2a6872\": rpc error: code = NotFound desc = could not find container \"48b1afc9744caf6610a0da91855ce37fdf399ad2c93bca28a113f1e46a2a6872\": container with ID starting with 48b1afc9744caf6610a0da91855ce37fdf399ad2c93bca28a113f1e46a2a6872 not found: ID does not exist" Oct 07 17:25:46 crc kubenswrapper[4681]: I1007 17:25:46.999312 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.001272 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.001441 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.053505 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92a58956-0b94-4fcf-85a3-1d185f0e906f" path="/var/lib/kubelet/pods/92a58956-0b94-4fcf-85a3-1d185f0e906f/volumes" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.055774 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6957504-c035-489e-95d3-3cab2485c2b0" path="/var/lib/kubelet/pods/c6957504-c035-489e-95d3-3cab2485c2b0/volumes" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.058366 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e80aacf-4a39-48b9-96c3-692936cf2855-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5e80aacf-4a39-48b9-96c3-692936cf2855\") " pod="openstack/nova-metadata-0" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.058425 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e80aacf-4a39-48b9-96c3-692936cf2855-logs\") pod \"nova-metadata-0\" (UID: \"5e80aacf-4a39-48b9-96c3-692936cf2855\") " pod="openstack/nova-metadata-0" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.058466 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51290795-4e81-4099-ab84-e9529128d78a-config-data\") pod \"nova-scheduler-0\" (UID: \"51290795-4e81-4099-ab84-e9529128d78a\") " pod="openstack/nova-scheduler-0" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.058492 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/51290795-4e81-4099-ab84-e9529128d78a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"51290795-4e81-4099-ab84-e9529128d78a\") " pod="openstack/nova-scheduler-0" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.058543 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e80aacf-4a39-48b9-96c3-692936cf2855-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5e80aacf-4a39-48b9-96c3-692936cf2855\") " pod="openstack/nova-metadata-0" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.058571 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q5b5\" (UniqueName: \"kubernetes.io/projected/51290795-4e81-4099-ab84-e9529128d78a-kube-api-access-4q5b5\") pod \"nova-scheduler-0\" (UID: \"51290795-4e81-4099-ab84-e9529128d78a\") " pod="openstack/nova-scheduler-0" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.058596 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2nzj\" (UniqueName: \"kubernetes.io/projected/5e80aacf-4a39-48b9-96c3-692936cf2855-kube-api-access-s2nzj\") pod \"nova-metadata-0\" (UID: \"5e80aacf-4a39-48b9-96c3-692936cf2855\") " pod="openstack/nova-metadata-0" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.058647 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e80aacf-4a39-48b9-96c3-692936cf2855-config-data\") pod \"nova-metadata-0\" (UID: \"5e80aacf-4a39-48b9-96c3-692936cf2855\") " pod="openstack/nova-metadata-0" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.159816 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e80aacf-4a39-48b9-96c3-692936cf2855-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5e80aacf-4a39-48b9-96c3-692936cf2855\") " pod="openstack/nova-metadata-0" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.159896 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q5b5\" (UniqueName: \"kubernetes.io/projected/51290795-4e81-4099-ab84-e9529128d78a-kube-api-access-4q5b5\") pod \"nova-scheduler-0\" (UID: \"51290795-4e81-4099-ab84-e9529128d78a\") " pod="openstack/nova-scheduler-0" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.159926 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2nzj\" (UniqueName: \"kubernetes.io/projected/5e80aacf-4a39-48b9-96c3-692936cf2855-kube-api-access-s2nzj\") pod \"nova-metadata-0\" (UID: \"5e80aacf-4a39-48b9-96c3-692936cf2855\") " pod="openstack/nova-metadata-0" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.159998 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e80aacf-4a39-48b9-96c3-692936cf2855-config-data\") pod \"nova-metadata-0\" (UID: \"5e80aacf-4a39-48b9-96c3-692936cf2855\") " pod="openstack/nova-metadata-0" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.160056 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e80aacf-4a39-48b9-96c3-692936cf2855-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"5e80aacf-4a39-48b9-96c3-692936cf2855\") " pod="openstack/nova-metadata-0" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.160093 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e80aacf-4a39-48b9-96c3-692936cf2855-logs\") pod \"nova-metadata-0\" (UID: \"5e80aacf-4a39-48b9-96c3-692936cf2855\") " pod="openstack/nova-metadata-0" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.160138 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51290795-4e81-4099-ab84-e9529128d78a-config-data\") pod \"nova-scheduler-0\" (UID: \"51290795-4e81-4099-ab84-e9529128d78a\") " pod="openstack/nova-scheduler-0" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.160166 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51290795-4e81-4099-ab84-e9529128d78a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"51290795-4e81-4099-ab84-e9529128d78a\") " pod="openstack/nova-scheduler-0" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.160962 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e80aacf-4a39-48b9-96c3-692936cf2855-logs\") pod \"nova-metadata-0\" (UID: \"5e80aacf-4a39-48b9-96c3-692936cf2855\") " pod="openstack/nova-metadata-0" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.163188 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e80aacf-4a39-48b9-96c3-692936cf2855-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5e80aacf-4a39-48b9-96c3-692936cf2855\") " pod="openstack/nova-metadata-0" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.163383 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51290795-4e81-4099-ab84-e9529128d78a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"51290795-4e81-4099-ab84-e9529128d78a\") " pod="openstack/nova-scheduler-0" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.164682 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e80aacf-4a39-48b9-96c3-692936cf2855-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5e80aacf-4a39-48b9-96c3-692936cf2855\") " pod="openstack/nova-metadata-0" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.174720 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e80aacf-4a39-48b9-96c3-692936cf2855-config-data\") pod \"nova-metadata-0\" (UID: \"5e80aacf-4a39-48b9-96c3-692936cf2855\") " pod="openstack/nova-metadata-0" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.175131 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51290795-4e81-4099-ab84-e9529128d78a-config-data\") pod \"nova-scheduler-0\" (UID: \"51290795-4e81-4099-ab84-e9529128d78a\") " pod="openstack/nova-scheduler-0" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.178367 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2nzj\" (UniqueName: \"kubernetes.io/projected/5e80aacf-4a39-48b9-96c3-692936cf2855-kube-api-access-s2nzj\") pod \"nova-metadata-0\" 
(UID: \"5e80aacf-4a39-48b9-96c3-692936cf2855\") " pod="openstack/nova-metadata-0" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.178900 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q5b5\" (UniqueName: \"kubernetes.io/projected/51290795-4e81-4099-ab84-e9529128d78a-kube-api-access-4q5b5\") pod \"nova-scheduler-0\" (UID: \"51290795-4e81-4099-ab84-e9529128d78a\") " pod="openstack/nova-scheduler-0" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.335281 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.357217 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.843442 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 17:25:47 crc kubenswrapper[4681]: W1007 17:25:47.844684 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51290795_4e81_4099_ab84_e9529128d78a.slice/crio-c5ef3d40ec671ea2b2009f1a38631541ace7b93b26b07d2b37dd332166285243 WatchSource:0}: Error finding container c5ef3d40ec671ea2b2009f1a38631541ace7b93b26b07d2b37dd332166285243: Status 404 returned error can't find the container with id c5ef3d40ec671ea2b2009f1a38631541ace7b93b26b07d2b37dd332166285243 Oct 07 17:25:47 crc kubenswrapper[4681]: I1007 17:25:47.906662 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 17:25:47 crc kubenswrapper[4681]: W1007 17:25:47.908837 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e80aacf_4a39_48b9_96c3_692936cf2855.slice/crio-16073534191eff0817ce87b3162712afb49dca5908db3bff1ac08e79ed0a978a WatchSource:0}: Error finding container 16073534191eff0817ce87b3162712afb49dca5908db3bff1ac08e79ed0a978a: Status 404 returned error can't find the container with id 16073534191eff0817ce87b3162712afb49dca5908db3bff1ac08e79ed0a978a Oct 07 17:25:48 crc kubenswrapper[4681]: I1007 17:25:48.866596 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e80aacf-4a39-48b9-96c3-692936cf2855","Type":"ContainerStarted","Data":"6a2842b28c32b2450bcc51b27ef2ec46016527983f8d4e557de8efdafc02e36a"} Oct 07 17:25:48 crc kubenswrapper[4681]: I1007 17:25:48.866896 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e80aacf-4a39-48b9-96c3-692936cf2855","Type":"ContainerStarted","Data":"cfc718a54a6fe8aefe0b149566257b089ec81ff088499b044aad3f2eca5302e5"} Oct 07 17:25:48 crc kubenswrapper[4681]: I1007 17:25:48.866912 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e80aacf-4a39-48b9-96c3-692936cf2855","Type":"ContainerStarted","Data":"16073534191eff0817ce87b3162712afb49dca5908db3bff1ac08e79ed0a978a"} Oct 07 17:25:48 crc kubenswrapper[4681]: I1007 17:25:48.868265 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"51290795-4e81-4099-ab84-e9529128d78a","Type":"ContainerStarted","Data":"77c3050de45763820398c642be56bea33151fa19f233c7e0db4210dc5dda2f2a"} Oct 07 17:25:48 crc kubenswrapper[4681]: I1007 17:25:48.868284 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"51290795-4e81-4099-ab84-e9529128d78a","Type":"ContainerStarted","Data":"c5ef3d40ec671ea2b2009f1a38631541ace7b93b26b07d2b37dd332166285243"} Oct 07 17:25:48 crc kubenswrapper[4681]: I1007 17:25:48.885600 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.88558644 podStartE2EDuration="2.88558644s" podCreationTimestamp="2025-10-07 17:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:25:48.884367966 +0000 UTC m=+1352.531779521" watchObservedRunningTime="2025-10-07 17:25:48.88558644 +0000 UTC m=+1352.532997995" Oct 07 17:25:48 crc kubenswrapper[4681]: I1007 17:25:48.920577 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.920550414 podStartE2EDuration="2.920550414s" podCreationTimestamp="2025-10-07 17:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:25:48.904221949 +0000 UTC m=+1352.551633524" watchObservedRunningTime="2025-10-07 17:25:48.920550414 +0000 UTC m=+1352.567961969" Oct 07 17:25:52 crc kubenswrapper[4681]: I1007 17:25:52.336203 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 07 17:25:52 crc kubenswrapper[4681]: I1007 17:25:52.357716 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 17:25:52 crc kubenswrapper[4681]: I1007 17:25:52.357975 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 17:25:54 crc kubenswrapper[4681]: I1007 17:25:54.481773 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 17:25:54 crc kubenswrapper[4681]: I1007 17:25:54.482218 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 17:25:55 crc kubenswrapper[4681]: I1007 17:25:55.494044 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9241da9a-f1bd-4d93-bd72-f84e5dd85083" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 17:25:55 crc kubenswrapper[4681]: I1007 17:25:55.494087 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9241da9a-f1bd-4d93-bd72-f84e5dd85083" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 17:25:57 crc kubenswrapper[4681]: I1007 17:25:57.336284 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 07 17:25:57 crc kubenswrapper[4681]: I1007 17:25:57.357835 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 17:25:57 crc kubenswrapper[4681]: I1007 17:25:57.357913 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 17:25:57 crc kubenswrapper[4681]: I1007 17:25:57.363990 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 07 17:25:58 crc kubenswrapper[4681]: I1007 
Oct 07 17:25:58 crc kubenswrapper[4681]: I1007 17:25:58.003052 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Oct 07 17:25:58 crc kubenswrapper[4681]: I1007 17:25:58.374059 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5e80aacf-4a39-48b9-96c3-692936cf2855" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 07 17:25:58 crc kubenswrapper[4681]: I1007 17:25:58.374097 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5e80aacf-4a39-48b9-96c3-692936cf2855" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 07 17:26:04 crc kubenswrapper[4681]: I1007 17:26:04.000356 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Oct 07 17:26:04 crc kubenswrapper[4681]: I1007 17:26:04.487596 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 07 17:26:04 crc kubenswrapper[4681]: I1007 17:26:04.488175 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 07 17:26:04 crc kubenswrapper[4681]: I1007 17:26:04.493026 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 07 17:26:04 crc kubenswrapper[4681]: I1007 17:26:04.494274 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 07 17:26:05 crc kubenswrapper[4681]: I1007 17:26:05.042303 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 07 17:26:05 crc kubenswrapper[4681]: I1007 17:26:05.047229 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 07 17:26:07 crc kubenswrapper[4681]: I1007 17:26:07.366279 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 07 17:26:07 crc kubenswrapper[4681]: I1007 17:26:07.369109 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 07 17:26:07 crc kubenswrapper[4681]: I1007 17:26:07.373590 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 07 17:26:08 crc kubenswrapper[4681]: I1007 17:26:08.067096 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 07 17:26:15 crc kubenswrapper[4681]: I1007 17:26:15.638213 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 07 17:26:16 crc kubenswrapper[4681]: I1007 17:26:16.438733 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 07 17:26:20 crc kubenswrapper[4681]: I1007 17:26:20.727768 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="44a71bcd-3178-4394-8031-673c93a6981e" containerName="rabbitmq" containerID="cri-o://4b0386f421398abfcce82612a18baa1f10211504d81960f450d73cc421a70798" gracePeriod=604795
Oct 07 17:26:21 crc kubenswrapper[4681]: I1007 17:26:21.001275 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="c8a62bbf-000f-4b40-87e9-8dad6f714178" containerName="rabbitmq" containerID="cri-o://6924af7e009ce21c3779524f061005b2d457d3c14b2242e4ae72a1082282a1db" gracePeriod=604796
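
gracePeriod=604795 and 604796 are the grace remaining at the moment the kill is issued: terminationGracePeriodSeconds for these RabbitMQ pods is evidently 604800 (7 days, a value RabbitMQ cluster deployments commonly use so brokers can drain before shutdown; that provenance is an assumption), minus the few seconds between the API DELETE and the kill. The arithmetic, matching the timestamps above:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// The kubelet logs the *remaining* grace at kill time. For
	// rabbitmq-server-0: DELETE at 17:26:15, kill at 17:26:20,
	// so roughly 5 whole seconds have elapsed from the 604800s budget.
	specGrace := int64(604800) // terminationGracePeriodSeconds (inferred)
	elapsedSinceDelete := 5 * time.Second
	remaining := specGrace - int64(elapsedSinceDelete/time.Second)
	fmt.Println("gracePeriod at kill:", remaining) // 604795, as logged
}
```

Both brokers then shut down cleanly well inside that window, as the exitCode=0 ContainerDied events below show.
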
podUID="c8a62bbf-000f-4b40-87e9-8dad6f714178" containerName="rabbitmq" containerID="cri-o://6924af7e009ce21c3779524f061005b2d457d3c14b2242e4ae72a1082282a1db" gracePeriod=604796 Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.253847 4681 generic.go:334] "Generic (PLEG): container finished" podID="44a71bcd-3178-4394-8031-673c93a6981e" containerID="4b0386f421398abfcce82612a18baa1f10211504d81960f450d73cc421a70798" exitCode=0 Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.254151 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"44a71bcd-3178-4394-8031-673c93a6981e","Type":"ContainerDied","Data":"4b0386f421398abfcce82612a18baa1f10211504d81960f450d73cc421a70798"} Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.254179 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"44a71bcd-3178-4394-8031-673c93a6981e","Type":"ContainerDied","Data":"aef77d536618a3528727af633c6130c53ce260a15559e2acd7dc70eb20f62351"} Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.254190 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aef77d536618a3528727af633c6130c53ce260a15559e2acd7dc70eb20f62351" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.259620 4681 generic.go:334] "Generic (PLEG): container finished" podID="c8a62bbf-000f-4b40-87e9-8dad6f714178" containerID="6924af7e009ce21c3779524f061005b2d457d3c14b2242e4ae72a1082282a1db" exitCode=0 Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.259672 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c8a62bbf-000f-4b40-87e9-8dad6f714178","Type":"ContainerDied","Data":"6924af7e009ce21c3779524f061005b2d457d3c14b2242e4ae72a1082282a1db"} Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.365704 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.495540 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/44a71bcd-3178-4394-8031-673c93a6981e-server-conf\") pod \"44a71bcd-3178-4394-8031-673c93a6981e\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.495838 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/44a71bcd-3178-4394-8031-673c93a6981e-rabbitmq-plugins\") pod \"44a71bcd-3178-4394-8031-673c93a6981e\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.495901 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/44a71bcd-3178-4394-8031-673c93a6981e-rabbitmq-tls\") pod \"44a71bcd-3178-4394-8031-673c93a6981e\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.495942 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/44a71bcd-3178-4394-8031-673c93a6981e-plugins-conf\") pod \"44a71bcd-3178-4394-8031-673c93a6981e\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.495972 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"44a71bcd-3178-4394-8031-673c93a6981e\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.496009 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/44a71bcd-3178-4394-8031-673c93a6981e-config-data\") pod \"44a71bcd-3178-4394-8031-673c93a6981e\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.496031 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/44a71bcd-3178-4394-8031-673c93a6981e-rabbitmq-erlang-cookie\") pod \"44a71bcd-3178-4394-8031-673c93a6981e\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.496064 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/44a71bcd-3178-4394-8031-673c93a6981e-pod-info\") pod \"44a71bcd-3178-4394-8031-673c93a6981e\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.496117 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6mwt\" (UniqueName: \"kubernetes.io/projected/44a71bcd-3178-4394-8031-673c93a6981e-kube-api-access-m6mwt\") pod \"44a71bcd-3178-4394-8031-673c93a6981e\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.496136 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/44a71bcd-3178-4394-8031-673c93a6981e-rabbitmq-confd\") pod \"44a71bcd-3178-4394-8031-673c93a6981e\" (UID: 
\"44a71bcd-3178-4394-8031-673c93a6981e\") " Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.496177 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/44a71bcd-3178-4394-8031-673c93a6981e-erlang-cookie-secret\") pod \"44a71bcd-3178-4394-8031-673c93a6981e\" (UID: \"44a71bcd-3178-4394-8031-673c93a6981e\") " Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.500470 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44a71bcd-3178-4394-8031-673c93a6981e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "44a71bcd-3178-4394-8031-673c93a6981e" (UID: "44a71bcd-3178-4394-8031-673c93a6981e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.501016 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44a71bcd-3178-4394-8031-673c93a6981e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "44a71bcd-3178-4394-8031-673c93a6981e" (UID: "44a71bcd-3178-4394-8031-673c93a6981e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.502736 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44a71bcd-3178-4394-8031-673c93a6981e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "44a71bcd-3178-4394-8031-673c93a6981e" (UID: "44a71bcd-3178-4394-8031-673c93a6981e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.508073 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44a71bcd-3178-4394-8031-673c93a6981e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "44a71bcd-3178-4394-8031-673c93a6981e" (UID: "44a71bcd-3178-4394-8031-673c93a6981e"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.509860 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/44a71bcd-3178-4394-8031-673c93a6981e-pod-info" (OuterVolumeSpecName: "pod-info") pod "44a71bcd-3178-4394-8031-673c93a6981e" (UID: "44a71bcd-3178-4394-8031-673c93a6981e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.521370 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44a71bcd-3178-4394-8031-673c93a6981e-kube-api-access-m6mwt" (OuterVolumeSpecName: "kube-api-access-m6mwt") pod "44a71bcd-3178-4394-8031-673c93a6981e" (UID: "44a71bcd-3178-4394-8031-673c93a6981e"). InnerVolumeSpecName "kube-api-access-m6mwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.528066 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44a71bcd-3178-4394-8031-673c93a6981e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "44a71bcd-3178-4394-8031-673c93a6981e" (UID: "44a71bcd-3178-4394-8031-673c93a6981e"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.543631 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "44a71bcd-3178-4394-8031-673c93a6981e" (UID: "44a71bcd-3178-4394-8031-673c93a6981e"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.557716 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44a71bcd-3178-4394-8031-673c93a6981e-config-data" (OuterVolumeSpecName: "config-data") pod "44a71bcd-3178-4394-8031-673c93a6981e" (UID: "44a71bcd-3178-4394-8031-673c93a6981e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.577670 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.591480 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44a71bcd-3178-4394-8031-673c93a6981e-server-conf" (OuterVolumeSpecName: "server-conf") pod "44a71bcd-3178-4394-8031-673c93a6981e" (UID: "44a71bcd-3178-4394-8031-673c93a6981e"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.599621 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6mwt\" (UniqueName: \"kubernetes.io/projected/44a71bcd-3178-4394-8031-673c93a6981e-kube-api-access-m6mwt\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.599653 4681 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/44a71bcd-3178-4394-8031-673c93a6981e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.599663 4681 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/44a71bcd-3178-4394-8031-673c93a6981e-server-conf\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.599672 4681 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/44a71bcd-3178-4394-8031-673c93a6981e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.599679 4681 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/44a71bcd-3178-4394-8031-673c93a6981e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.599689 4681 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/44a71bcd-3178-4394-8031-673c93a6981e-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.599716 4681 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.599726 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/44a71bcd-3178-4394-8031-673c93a6981e-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.599734 4681 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/44a71bcd-3178-4394-8031-673c93a6981e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.599742 4681 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/44a71bcd-3178-4394-8031-673c93a6981e-pod-info\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.650643 4681 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.660851 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44a71bcd-3178-4394-8031-673c93a6981e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "44a71bcd-3178-4394-8031-673c93a6981e" (UID: "44a71bcd-3178-4394-8031-673c93a6981e"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.703498 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c8a62bbf-000f-4b40-87e9-8dad6f714178-rabbitmq-erlang-cookie\") pod \"c8a62bbf-000f-4b40-87e9-8dad6f714178\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.703559 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c8a62bbf-000f-4b40-87e9-8dad6f714178-config-data\") pod \"c8a62bbf-000f-4b40-87e9-8dad6f714178\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.703618 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"c8a62bbf-000f-4b40-87e9-8dad6f714178\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.703644 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c99dx\" (UniqueName: \"kubernetes.io/projected/c8a62bbf-000f-4b40-87e9-8dad6f714178-kube-api-access-c99dx\") pod \"c8a62bbf-000f-4b40-87e9-8dad6f714178\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.703661 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c8a62bbf-000f-4b40-87e9-8dad6f714178-rabbitmq-confd\") pod \"c8a62bbf-000f-4b40-87e9-8dad6f714178\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.703677 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c8a62bbf-000f-4b40-87e9-8dad6f714178-server-conf\") pod \"c8a62bbf-000f-4b40-87e9-8dad6f714178\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.703746 4681 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c8a62bbf-000f-4b40-87e9-8dad6f714178-rabbitmq-plugins\") pod \"c8a62bbf-000f-4b40-87e9-8dad6f714178\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.703798 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c8a62bbf-000f-4b40-87e9-8dad6f714178-plugins-conf\") pod \"c8a62bbf-000f-4b40-87e9-8dad6f714178\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.703820 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c8a62bbf-000f-4b40-87e9-8dad6f714178-rabbitmq-tls\") pod \"c8a62bbf-000f-4b40-87e9-8dad6f714178\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.703849 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c8a62bbf-000f-4b40-87e9-8dad6f714178-erlang-cookie-secret\") pod \"c8a62bbf-000f-4b40-87e9-8dad6f714178\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.703869 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c8a62bbf-000f-4b40-87e9-8dad6f714178-pod-info\") pod \"c8a62bbf-000f-4b40-87e9-8dad6f714178\" (UID: \"c8a62bbf-000f-4b40-87e9-8dad6f714178\") " Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.706294 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a62bbf-000f-4b40-87e9-8dad6f714178-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c8a62bbf-000f-4b40-87e9-8dad6f714178" (UID: "c8a62bbf-000f-4b40-87e9-8dad6f714178"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.706998 4681 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c8a62bbf-000f-4b40-87e9-8dad6f714178-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.711356 4681 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.711477 4681 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/44a71bcd-3178-4394-8031-673c93a6981e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.708727 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8a62bbf-000f-4b40-87e9-8dad6f714178-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c8a62bbf-000f-4b40-87e9-8dad6f714178" (UID: "c8a62bbf-000f-4b40-87e9-8dad6f714178"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.714047 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8a62bbf-000f-4b40-87e9-8dad6f714178-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c8a62bbf-000f-4b40-87e9-8dad6f714178" (UID: "c8a62bbf-000f-4b40-87e9-8dad6f714178"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.710388 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8a62bbf-000f-4b40-87e9-8dad6f714178-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c8a62bbf-000f-4b40-87e9-8dad6f714178" (UID: "c8a62bbf-000f-4b40-87e9-8dad6f714178"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.726442 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "c8a62bbf-000f-4b40-87e9-8dad6f714178" (UID: "c8a62bbf-000f-4b40-87e9-8dad6f714178"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.730048 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c8a62bbf-000f-4b40-87e9-8dad6f714178-pod-info" (OuterVolumeSpecName: "pod-info") pod "c8a62bbf-000f-4b40-87e9-8dad6f714178" (UID: "c8a62bbf-000f-4b40-87e9-8dad6f714178"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.732153 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8a62bbf-000f-4b40-87e9-8dad6f714178-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c8a62bbf-000f-4b40-87e9-8dad6f714178" (UID: "c8a62bbf-000f-4b40-87e9-8dad6f714178"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.736456 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8a62bbf-000f-4b40-87e9-8dad6f714178-kube-api-access-c99dx" (OuterVolumeSpecName: "kube-api-access-c99dx") pod "c8a62bbf-000f-4b40-87e9-8dad6f714178" (UID: "c8a62bbf-000f-4b40-87e9-8dad6f714178"). InnerVolumeSpecName "kube-api-access-c99dx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.777848 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a62bbf-000f-4b40-87e9-8dad6f714178-config-data" (OuterVolumeSpecName: "config-data") pod "c8a62bbf-000f-4b40-87e9-8dad6f714178" (UID: "c8a62bbf-000f-4b40-87e9-8dad6f714178"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.813892 4681 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c8a62bbf-000f-4b40-87e9-8dad6f714178-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.813924 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c8a62bbf-000f-4b40-87e9-8dad6f714178-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.813936 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c99dx\" (UniqueName: \"kubernetes.io/projected/c8a62bbf-000f-4b40-87e9-8dad6f714178-kube-api-access-c99dx\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.813965 4681 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.813977 4681 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c8a62bbf-000f-4b40-87e9-8dad6f714178-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.813985 4681 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c8a62bbf-000f-4b40-87e9-8dad6f714178-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.813993 4681 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c8a62bbf-000f-4b40-87e9-8dad6f714178-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.814000 4681 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c8a62bbf-000f-4b40-87e9-8dad6f714178-pod-info\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.834729 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a62bbf-000f-4b40-87e9-8dad6f714178-server-conf" (OuterVolumeSpecName: "server-conf") pod "c8a62bbf-000f-4b40-87e9-8dad6f714178" (UID: "c8a62bbf-000f-4b40-87e9-8dad6f714178"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.845743 4681 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.897570 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8a62bbf-000f-4b40-87e9-8dad6f714178-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c8a62bbf-000f-4b40-87e9-8dad6f714178" (UID: "c8a62bbf-000f-4b40-87e9-8dad6f714178"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.916354 4681 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.916390 4681 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c8a62bbf-000f-4b40-87e9-8dad6f714178-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:27 crc kubenswrapper[4681]: I1007 17:26:27.916401 4681 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c8a62bbf-000f-4b40-87e9-8dad6f714178-server-conf\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.269942 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.270145 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.270166 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c8a62bbf-000f-4b40-87e9-8dad6f714178","Type":"ContainerDied","Data":"88c2feb12155a70577c58a0b6248c830b305f10eccfbd872c73d354a9e85b472"} Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.271006 4681 scope.go:117] "RemoveContainer" containerID="6924af7e009ce21c3779524f061005b2d457d3c14b2242e4ae72a1082282a1db" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.305167 4681 scope.go:117] "RemoveContainer" containerID="63784714c0d4bc78a8ead0376932527f23f2511b675cec54cfcb3226d4bdd559" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.404940 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.419349 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.444592 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.447493 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.455751 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 17:26:28 crc kubenswrapper[4681]: E1007 17:26:28.456087 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a62bbf-000f-4b40-87e9-8dad6f714178" containerName="setup-container" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.456105 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a62bbf-000f-4b40-87e9-8dad6f714178" containerName="setup-container" Oct 07 17:26:28 crc kubenswrapper[4681]: E1007 17:26:28.456120 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a71bcd-3178-4394-8031-673c93a6981e" containerName="rabbitmq" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.456128 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a71bcd-3178-4394-8031-673c93a6981e" containerName="rabbitmq" Oct 07 17:26:28 crc kubenswrapper[4681]: E1007 17:26:28.456163 4681 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c8a62bbf-000f-4b40-87e9-8dad6f714178" containerName="rabbitmq" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.456169 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a62bbf-000f-4b40-87e9-8dad6f714178" containerName="rabbitmq" Oct 07 17:26:28 crc kubenswrapper[4681]: E1007 17:26:28.456188 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a71bcd-3178-4394-8031-673c93a6981e" containerName="setup-container" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.456194 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a71bcd-3178-4394-8031-673c93a6981e" containerName="setup-container" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.456372 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="44a71bcd-3178-4394-8031-673c93a6981e" containerName="rabbitmq" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.456385 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8a62bbf-000f-4b40-87e9-8dad6f714178" containerName="rabbitmq" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.457659 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.462651 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.463033 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.463707 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.463848 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-thq44" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.463994 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.464502 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.464616 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.478646 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.487859 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.487971 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.493688 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.493838 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.493865 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.494117 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-m6258" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.494288 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.494745 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.494789 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.498245 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.637163 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6b4aa12d-0e45-47d7-b279-e705aef9c323-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.637206 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6b4aa12d-0e45-47d7-b279-e705aef9c323-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.637231 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6b4aa12d-0e45-47d7-b279-e705aef9c323-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.637252 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7tsw\" (UniqueName: \"kubernetes.io/projected/4222be9f-615b-431f-9285-c629a68426e0-kube-api-access-t7tsw\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.637304 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4222be9f-615b-431f-9285-c629a68426e0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.637319 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" 
(UniqueName: \"kubernetes.io/configmap/4222be9f-615b-431f-9285-c629a68426e0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.637499 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6b4aa12d-0e45-47d7-b279-e705aef9c323-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.637515 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.637540 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4222be9f-615b-431f-9285-c629a68426e0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.637563 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxdns\" (UniqueName: \"kubernetes.io/projected/6b4aa12d-0e45-47d7-b279-e705aef9c323-kube-api-access-jxdns\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.637596 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.637622 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4222be9f-615b-431f-9285-c629a68426e0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.637642 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b4aa12d-0e45-47d7-b279-e705aef9c323-config-data\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.637662 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4222be9f-615b-431f-9285-c629a68426e0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.637684 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/4222be9f-615b-431f-9285-c629a68426e0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.637700 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4222be9f-615b-431f-9285-c629a68426e0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.637717 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6b4aa12d-0e45-47d7-b279-e705aef9c323-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.637760 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4222be9f-615b-431f-9285-c629a68426e0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.637783 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4222be9f-615b-431f-9285-c629a68426e0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.637803 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6b4aa12d-0e45-47d7-b279-e705aef9c323-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.637831 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6b4aa12d-0e45-47d7-b279-e705aef9c323-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.637859 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6b4aa12d-0e45-47d7-b279-e705aef9c323-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.739839 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4222be9f-615b-431f-9285-c629a68426e0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.739928 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/4222be9f-615b-431f-9285-c629a68426e0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.739956 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6b4aa12d-0e45-47d7-b279-e705aef9c323-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.739976 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6b4aa12d-0e45-47d7-b279-e705aef9c323-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.740007 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6b4aa12d-0e45-47d7-b279-e705aef9c323-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.740034 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6b4aa12d-0e45-47d7-b279-e705aef9c323-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.740053 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6b4aa12d-0e45-47d7-b279-e705aef9c323-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.740068 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6b4aa12d-0e45-47d7-b279-e705aef9c323-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.740084 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7tsw\" (UniqueName: \"kubernetes.io/projected/4222be9f-615b-431f-9285-c629a68426e0-kube-api-access-t7tsw\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.740110 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4222be9f-615b-431f-9285-c629a68426e0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.740124 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4222be9f-615b-431f-9285-c629a68426e0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.740138 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6b4aa12d-0e45-47d7-b279-e705aef9c323-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.740154 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.740173 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4222be9f-615b-431f-9285-c629a68426e0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.740188 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxdns\" (UniqueName: \"kubernetes.io/projected/6b4aa12d-0e45-47d7-b279-e705aef9c323-kube-api-access-jxdns\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.740211 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.740230 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4222be9f-615b-431f-9285-c629a68426e0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.740255 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b4aa12d-0e45-47d7-b279-e705aef9c323-config-data\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.740277 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4222be9f-615b-431f-9285-c629a68426e0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.740296 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4222be9f-615b-431f-9285-c629a68426e0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.740313 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/4222be9f-615b-431f-9285-c629a68426e0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.740333 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6b4aa12d-0e45-47d7-b279-e705aef9c323-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.740553 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4222be9f-615b-431f-9285-c629a68426e0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.740639 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6b4aa12d-0e45-47d7-b279-e705aef9c323-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.740815 4681 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.741169 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6b4aa12d-0e45-47d7-b279-e705aef9c323-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.741612 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6b4aa12d-0e45-47d7-b279-e705aef9c323-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.741721 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4222be9f-615b-431f-9285-c629a68426e0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.742621 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b4aa12d-0e45-47d7-b279-e705aef9c323-config-data\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.743228 4681 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc 
kubenswrapper[4681]: I1007 17:26:28.743313 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6b4aa12d-0e45-47d7-b279-e705aef9c323-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.748701 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6b4aa12d-0e45-47d7-b279-e705aef9c323-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.749033 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4222be9f-615b-431f-9285-c629a68426e0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.750114 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4222be9f-615b-431f-9285-c629a68426e0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.750410 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4222be9f-615b-431f-9285-c629a68426e0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.751265 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6b4aa12d-0e45-47d7-b279-e705aef9c323-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.751678 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4222be9f-615b-431f-9285-c629a68426e0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.753924 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6b4aa12d-0e45-47d7-b279-e705aef9c323-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.756669 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4222be9f-615b-431f-9285-c629a68426e0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.758742 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4222be9f-615b-431f-9285-c629a68426e0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.759161 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6b4aa12d-0e45-47d7-b279-e705aef9c323-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.760784 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4222be9f-615b-431f-9285-c629a68426e0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.768415 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7tsw\" (UniqueName: \"kubernetes.io/projected/4222be9f-615b-431f-9285-c629a68426e0-kube-api-access-t7tsw\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.771077 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxdns\" (UniqueName: \"kubernetes.io/projected/6b4aa12d-0e45-47d7-b279-e705aef9c323-kube-api-access-jxdns\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.792608 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4222be9f-615b-431f-9285-c629a68426e0\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.809607 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6b4aa12d-0e45-47d7-b279-e705aef9c323\") " pod="openstack/rabbitmq-server-0" Oct 07 17:26:28 crc kubenswrapper[4681]: I1007 17:26:28.816052 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 17:26:29 crc kubenswrapper[4681]: I1007 17:26:29.044773 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44a71bcd-3178-4394-8031-673c93a6981e" path="/var/lib/kubelet/pods/44a71bcd-3178-4394-8031-673c93a6981e/volumes" Oct 07 17:26:29 crc kubenswrapper[4681]: I1007 17:26:29.046630 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8a62bbf-000f-4b40-87e9-8dad6f714178" path="/var/lib/kubelet/pods/c8a62bbf-000f-4b40-87e9-8dad6f714178/volumes" Oct 07 17:26:29 crc kubenswrapper[4681]: I1007 17:26:29.083866 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:26:29 crc kubenswrapper[4681]: I1007 17:26:29.420073 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 17:26:29 crc kubenswrapper[4681]: I1007 17:26:29.732043 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 17:26:29 crc kubenswrapper[4681]: W1007 17:26:29.734974 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4222be9f_615b_431f_9285_c629a68426e0.slice/crio-94d58764e52cfa58ce014269d11e07c61f20bc230516c3aa9d8f42354d4b1bb1 WatchSource:0}: Error finding container 94d58764e52cfa58ce014269d11e07c61f20bc230516c3aa9d8f42354d4b1bb1: Status 404 returned error can't find the container with id 94d58764e52cfa58ce014269d11e07c61f20bc230516c3aa9d8f42354d4b1bb1 Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.310258 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6b4aa12d-0e45-47d7-b279-e705aef9c323","Type":"ContainerStarted","Data":"f44ce1a5e851ffd3effe871c66232de957e00b147bdfffbe3547d5c5f0c7788d"} Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.311631 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4222be9f-615b-431f-9285-c629a68426e0","Type":"ContainerStarted","Data":"94d58764e52cfa58ce014269d11e07c61f20bc230516c3aa9d8f42354d4b1bb1"} Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.497629 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d558885bc-jfpc8"] Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.499552 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-jfpc8" Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.504419 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.574087 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f526p\" (UniqueName: \"kubernetes.io/projected/3c5a5567-881d-40ef-8343-60c88eb7d0fe-kube-api-access-f526p\") pod \"dnsmasq-dns-d558885bc-jfpc8\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " pod="openstack/dnsmasq-dns-d558885bc-jfpc8" Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.574141 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-jfpc8\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " pod="openstack/dnsmasq-dns-d558885bc-jfpc8" Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.574193 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-config\") pod \"dnsmasq-dns-d558885bc-jfpc8\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " pod="openstack/dnsmasq-dns-d558885bc-jfpc8" Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.574269 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-jfpc8\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " pod="openstack/dnsmasq-dns-d558885bc-jfpc8" Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.574417 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-dns-svc\") pod \"dnsmasq-dns-d558885bc-jfpc8\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " pod="openstack/dnsmasq-dns-d558885bc-jfpc8" Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.574536 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-jfpc8\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " pod="openstack/dnsmasq-dns-d558885bc-jfpc8" Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.574600 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-jfpc8\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " pod="openstack/dnsmasq-dns-d558885bc-jfpc8" Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.576095 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-jfpc8"] Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.676501 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f526p\" (UniqueName: \"kubernetes.io/projected/3c5a5567-881d-40ef-8343-60c88eb7d0fe-kube-api-access-f526p\") pod \"dnsmasq-dns-d558885bc-jfpc8\" 
(UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " pod="openstack/dnsmasq-dns-d558885bc-jfpc8" Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.676556 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-jfpc8\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " pod="openstack/dnsmasq-dns-d558885bc-jfpc8" Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.676604 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-config\") pod \"dnsmasq-dns-d558885bc-jfpc8\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " pod="openstack/dnsmasq-dns-d558885bc-jfpc8" Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.676620 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-jfpc8\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " pod="openstack/dnsmasq-dns-d558885bc-jfpc8" Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.676645 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-dns-svc\") pod \"dnsmasq-dns-d558885bc-jfpc8\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " pod="openstack/dnsmasq-dns-d558885bc-jfpc8" Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.676691 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-jfpc8\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " pod="openstack/dnsmasq-dns-d558885bc-jfpc8" Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.676739 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-jfpc8\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " pod="openstack/dnsmasq-dns-d558885bc-jfpc8" Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.677761 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-jfpc8\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " pod="openstack/dnsmasq-dns-d558885bc-jfpc8" Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.677770 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-jfpc8\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " pod="openstack/dnsmasq-dns-d558885bc-jfpc8" Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.677839 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-jfpc8\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " pod="openstack/dnsmasq-dns-d558885bc-jfpc8" 
Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.677874 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-dns-svc\") pod \"dnsmasq-dns-d558885bc-jfpc8\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " pod="openstack/dnsmasq-dns-d558885bc-jfpc8" Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.677954 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-jfpc8\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " pod="openstack/dnsmasq-dns-d558885bc-jfpc8" Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.678186 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-config\") pod \"dnsmasq-dns-d558885bc-jfpc8\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " pod="openstack/dnsmasq-dns-d558885bc-jfpc8" Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.697266 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f526p\" (UniqueName: \"kubernetes.io/projected/3c5a5567-881d-40ef-8343-60c88eb7d0fe-kube-api-access-f526p\") pod \"dnsmasq-dns-d558885bc-jfpc8\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " pod="openstack/dnsmasq-dns-d558885bc-jfpc8" Oct 07 17:26:30 crc kubenswrapper[4681]: I1007 17:26:30.825463 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-jfpc8" Oct 07 17:26:31 crc kubenswrapper[4681]: I1007 17:26:31.308469 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-jfpc8"] Oct 07 17:26:31 crc kubenswrapper[4681]: I1007 17:26:31.320363 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-jfpc8" event={"ID":"3c5a5567-881d-40ef-8343-60c88eb7d0fe","Type":"ContainerStarted","Data":"176a4fd116e7862b0342f919214c423fd5fc377df8a70afc2a125db2bddd4ca1"} Oct 07 17:26:31 crc kubenswrapper[4681]: I1007 17:26:31.323380 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6b4aa12d-0e45-47d7-b279-e705aef9c323","Type":"ContainerStarted","Data":"2cdb12a0b8a4a18b9c681d92f29ecf8eee4ce67cacd19c8c5f1c73bd2f52fcc6"} Oct 07 17:26:31 crc kubenswrapper[4681]: I1007 17:26:31.326722 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4222be9f-615b-431f-9285-c629a68426e0","Type":"ContainerStarted","Data":"f7403cd123b83b167ddd82aa330e366f5a12cb66d5ae6b773c355fca298dfa2d"} Oct 07 17:26:32 crc kubenswrapper[4681]: I1007 17:26:32.337860 4681 generic.go:334] "Generic (PLEG): container finished" podID="3c5a5567-881d-40ef-8343-60c88eb7d0fe" containerID="f92c2a1bca06ac5ea9450cf3471b9f6e11207ebbe1e477bfaab38fc2d49300e1" exitCode=0 Oct 07 17:26:32 crc kubenswrapper[4681]: I1007 17:26:32.337979 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-jfpc8" event={"ID":"3c5a5567-881d-40ef-8343-60c88eb7d0fe","Type":"ContainerDied","Data":"f92c2a1bca06ac5ea9450cf3471b9f6e11207ebbe1e477bfaab38fc2d49300e1"} Oct 07 17:26:33 crc kubenswrapper[4681]: I1007 17:26:33.348795 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-jfpc8" 
event={"ID":"3c5a5567-881d-40ef-8343-60c88eb7d0fe","Type":"ContainerStarted","Data":"a10d1b50b9519e37bb82e27890c9997c502f2e6964b60f844d8252f7b254da5d"} Oct 07 17:26:33 crc kubenswrapper[4681]: I1007 17:26:33.349256 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d558885bc-jfpc8" Oct 07 17:26:40 crc kubenswrapper[4681]: I1007 17:26:40.827265 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d558885bc-jfpc8" Oct 07 17:26:40 crc kubenswrapper[4681]: I1007 17:26:40.849307 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d558885bc-jfpc8" podStartSLOduration=10.849282428 podStartE2EDuration="10.849282428s" podCreationTimestamp="2025-10-07 17:26:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:26:33.373492211 +0000 UTC m=+1397.020903766" watchObservedRunningTime="2025-10-07 17:26:40.849282428 +0000 UTC m=+1404.496693993" Oct 07 17:26:40 crc kubenswrapper[4681]: I1007 17:26:40.915324 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-jf6hm"] Oct 07 17:26:40 crc kubenswrapper[4681]: I1007 17:26:40.915725 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" podUID="ff96470c-7e4d-4f94-8ed6-c42fb4d63928" containerName="dnsmasq-dns" containerID="cri-o://169b180b499eafdbab027192528b6ce922fffea81f375a56372be81f69df07b9" gracePeriod=10 Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.033899 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" podUID="ff96470c-7e4d-4f94-8ed6-c42fb4d63928" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.197:5353: connect: connection refused" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.137131 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67cb876dc9-zsjhj"] Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.138734 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.249985 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67cb876dc9-zsjhj"] Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.306148 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4nvj\" (UniqueName: \"kubernetes.io/projected/a5b5bb10-eaaa-410b-8040-c9b15d4c0e62-kube-api-access-m4nvj\") pod \"dnsmasq-dns-67cb876dc9-zsjhj\" (UID: \"a5b5bb10-eaaa-410b-8040-c9b15d4c0e62\") " pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.306350 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5b5bb10-eaaa-410b-8040-c9b15d4c0e62-config\") pod \"dnsmasq-dns-67cb876dc9-zsjhj\" (UID: \"a5b5bb10-eaaa-410b-8040-c9b15d4c0e62\") " pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.306402 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a5b5bb10-eaaa-410b-8040-c9b15d4c0e62-openstack-edpm-ipam\") pod \"dnsmasq-dns-67cb876dc9-zsjhj\" (UID: \"a5b5bb10-eaaa-410b-8040-c9b15d4c0e62\") " pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.306428 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5b5bb10-eaaa-410b-8040-c9b15d4c0e62-ovsdbserver-sb\") pod \"dnsmasq-dns-67cb876dc9-zsjhj\" (UID: \"a5b5bb10-eaaa-410b-8040-c9b15d4c0e62\") " pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.306465 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5b5bb10-eaaa-410b-8040-c9b15d4c0e62-ovsdbserver-nb\") pod \"dnsmasq-dns-67cb876dc9-zsjhj\" (UID: \"a5b5bb10-eaaa-410b-8040-c9b15d4c0e62\") " pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.306481 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5b5bb10-eaaa-410b-8040-c9b15d4c0e62-dns-svc\") pod \"dnsmasq-dns-67cb876dc9-zsjhj\" (UID: \"a5b5bb10-eaaa-410b-8040-c9b15d4c0e62\") " pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.306517 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a5b5bb10-eaaa-410b-8040-c9b15d4c0e62-dns-swift-storage-0\") pod \"dnsmasq-dns-67cb876dc9-zsjhj\" (UID: \"a5b5bb10-eaaa-410b-8040-c9b15d4c0e62\") " pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.409956 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5b5bb10-eaaa-410b-8040-c9b15d4c0e62-dns-svc\") pod \"dnsmasq-dns-67cb876dc9-zsjhj\" (UID: \"a5b5bb10-eaaa-410b-8040-c9b15d4c0e62\") " pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.409991 4681 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5b5bb10-eaaa-410b-8040-c9b15d4c0e62-ovsdbserver-nb\") pod \"dnsmasq-dns-67cb876dc9-zsjhj\" (UID: \"a5b5bb10-eaaa-410b-8040-c9b15d4c0e62\") " pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.410065 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a5b5bb10-eaaa-410b-8040-c9b15d4c0e62-dns-swift-storage-0\") pod \"dnsmasq-dns-67cb876dc9-zsjhj\" (UID: \"a5b5bb10-eaaa-410b-8040-c9b15d4c0e62\") " pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.410161 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4nvj\" (UniqueName: \"kubernetes.io/projected/a5b5bb10-eaaa-410b-8040-c9b15d4c0e62-kube-api-access-m4nvj\") pod \"dnsmasq-dns-67cb876dc9-zsjhj\" (UID: \"a5b5bb10-eaaa-410b-8040-c9b15d4c0e62\") " pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.410262 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5b5bb10-eaaa-410b-8040-c9b15d4c0e62-config\") pod \"dnsmasq-dns-67cb876dc9-zsjhj\" (UID: \"a5b5bb10-eaaa-410b-8040-c9b15d4c0e62\") " pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.410369 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a5b5bb10-eaaa-410b-8040-c9b15d4c0e62-openstack-edpm-ipam\") pod \"dnsmasq-dns-67cb876dc9-zsjhj\" (UID: \"a5b5bb10-eaaa-410b-8040-c9b15d4c0e62\") " pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.410416 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5b5bb10-eaaa-410b-8040-c9b15d4c0e62-ovsdbserver-sb\") pod \"dnsmasq-dns-67cb876dc9-zsjhj\" (UID: \"a5b5bb10-eaaa-410b-8040-c9b15d4c0e62\") " pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.411272 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5b5bb10-eaaa-410b-8040-c9b15d4c0e62-ovsdbserver-sb\") pod \"dnsmasq-dns-67cb876dc9-zsjhj\" (UID: \"a5b5bb10-eaaa-410b-8040-c9b15d4c0e62\") " pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.412450 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5b5bb10-eaaa-410b-8040-c9b15d4c0e62-ovsdbserver-nb\") pod \"dnsmasq-dns-67cb876dc9-zsjhj\" (UID: \"a5b5bb10-eaaa-410b-8040-c9b15d4c0e62\") " pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.412545 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5b5bb10-eaaa-410b-8040-c9b15d4c0e62-dns-svc\") pod \"dnsmasq-dns-67cb876dc9-zsjhj\" (UID: \"a5b5bb10-eaaa-410b-8040-c9b15d4c0e62\") " pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.412955 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a5b5bb10-eaaa-410b-8040-c9b15d4c0e62-dns-swift-storage-0\") pod \"dnsmasq-dns-67cb876dc9-zsjhj\" (UID: \"a5b5bb10-eaaa-410b-8040-c9b15d4c0e62\") " pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.413469 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5b5bb10-eaaa-410b-8040-c9b15d4c0e62-config\") pod \"dnsmasq-dns-67cb876dc9-zsjhj\" (UID: \"a5b5bb10-eaaa-410b-8040-c9b15d4c0e62\") " pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.413968 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a5b5bb10-eaaa-410b-8040-c9b15d4c0e62-openstack-edpm-ipam\") pod \"dnsmasq-dns-67cb876dc9-zsjhj\" (UID: \"a5b5bb10-eaaa-410b-8040-c9b15d4c0e62\") " pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.423297 4681 generic.go:334] "Generic (PLEG): container finished" podID="ff96470c-7e4d-4f94-8ed6-c42fb4d63928" containerID="169b180b499eafdbab027192528b6ce922fffea81f375a56372be81f69df07b9" exitCode=0 Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.423336 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" event={"ID":"ff96470c-7e4d-4f94-8ed6-c42fb4d63928","Type":"ContainerDied","Data":"169b180b499eafdbab027192528b6ce922fffea81f375a56372be81f69df07b9"} Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.461703 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4nvj\" (UniqueName: \"kubernetes.io/projected/a5b5bb10-eaaa-410b-8040-c9b15d4c0e62-kube-api-access-m4nvj\") pod \"dnsmasq-dns-67cb876dc9-zsjhj\" (UID: \"a5b5bb10-eaaa-410b-8040-c9b15d4c0e62\") " pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.465463 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.639046 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.717002 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-dns-svc\") pod \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\" (UID: \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\") " Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.717352 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-ovsdbserver-sb\") pod \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\" (UID: \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\") " Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.717435 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-config\") pod \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\" (UID: \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\") " Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.717520 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-dns-swift-storage-0\") pod \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\" (UID: \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\") " Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.717722 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-ovsdbserver-nb\") pod \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\" (UID: \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\") " Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.717775 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp94f\" (UniqueName: \"kubernetes.io/projected/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-kube-api-access-bp94f\") pod \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\" (UID: \"ff96470c-7e4d-4f94-8ed6-c42fb4d63928\") " Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.723981 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-kube-api-access-bp94f" (OuterVolumeSpecName: "kube-api-access-bp94f") pod "ff96470c-7e4d-4f94-8ed6-c42fb4d63928" (UID: "ff96470c-7e4d-4f94-8ed6-c42fb4d63928"). InnerVolumeSpecName "kube-api-access-bp94f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.787156 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-config" (OuterVolumeSpecName: "config") pod "ff96470c-7e4d-4f94-8ed6-c42fb4d63928" (UID: "ff96470c-7e4d-4f94-8ed6-c42fb4d63928"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.821527 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp94f\" (UniqueName: \"kubernetes.io/projected/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-kube-api-access-bp94f\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.821612 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.822289 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ff96470c-7e4d-4f94-8ed6-c42fb4d63928" (UID: "ff96470c-7e4d-4f94-8ed6-c42fb4d63928"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.823140 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ff96470c-7e4d-4f94-8ed6-c42fb4d63928" (UID: "ff96470c-7e4d-4f94-8ed6-c42fb4d63928"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.823402 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ff96470c-7e4d-4f94-8ed6-c42fb4d63928" (UID: "ff96470c-7e4d-4f94-8ed6-c42fb4d63928"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.827133 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ff96470c-7e4d-4f94-8ed6-c42fb4d63928" (UID: "ff96470c-7e4d-4f94-8ed6-c42fb4d63928"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.923759 4681 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.923787 4681 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.923798 4681 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:41 crc kubenswrapper[4681]: I1007 17:26:41.923807 4681 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff96470c-7e4d-4f94-8ed6-c42fb4d63928-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:42 crc kubenswrapper[4681]: I1007 17:26:42.051182 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67cb876dc9-zsjhj"] Oct 07 17:26:42 crc kubenswrapper[4681]: W1007 17:26:42.054529 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5b5bb10_eaaa_410b_8040_c9b15d4c0e62.slice/crio-f7e985aa27b29d777787c2df5dfd8177ad79e0cf737a17dc8e4a028bb4e7a491 WatchSource:0}: Error finding container f7e985aa27b29d777787c2df5dfd8177ad79e0cf737a17dc8e4a028bb4e7a491: Status 404 returned error can't find the container with id f7e985aa27b29d777787c2df5dfd8177ad79e0cf737a17dc8e4a028bb4e7a491 Oct 07 17:26:42 crc kubenswrapper[4681]: I1007 17:26:42.432577 4681 generic.go:334] "Generic (PLEG): container finished" podID="a5b5bb10-eaaa-410b-8040-c9b15d4c0e62" containerID="88691f536f37a3043ef3894a5ccfa8f8d14b720324b620c5294e6b9efe41a165" exitCode=0 Oct 07 17:26:42 crc kubenswrapper[4681]: I1007 17:26:42.432719 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" event={"ID":"a5b5bb10-eaaa-410b-8040-c9b15d4c0e62","Type":"ContainerDied","Data":"88691f536f37a3043ef3894a5ccfa8f8d14b720324b620c5294e6b9efe41a165"} Oct 07 17:26:42 crc kubenswrapper[4681]: I1007 17:26:42.432975 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" event={"ID":"a5b5bb10-eaaa-410b-8040-c9b15d4c0e62","Type":"ContainerStarted","Data":"f7e985aa27b29d777787c2df5dfd8177ad79e0cf737a17dc8e4a028bb4e7a491"} Oct 07 17:26:42 crc kubenswrapper[4681]: I1007 17:26:42.437071 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" event={"ID":"ff96470c-7e4d-4f94-8ed6-c42fb4d63928","Type":"ContainerDied","Data":"9585a2440f12245e377ada037621f458817a8b87eaf73b3c9ebb64a9f150e392"} Oct 07 17:26:42 crc kubenswrapper[4681]: I1007 17:26:42.437120 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-jf6hm" Oct 07 17:26:42 crc kubenswrapper[4681]: I1007 17:26:42.437126 4681 scope.go:117] "RemoveContainer" containerID="169b180b499eafdbab027192528b6ce922fffea81f375a56372be81f69df07b9" Oct 07 17:26:42 crc kubenswrapper[4681]: I1007 17:26:42.574077 4681 scope.go:117] "RemoveContainer" containerID="bc4d6bdb24f157eacff17ab3fb85a399e1b0acbd68259d6ce3970ee795d0c46c" Oct 07 17:26:42 crc kubenswrapper[4681]: I1007 17:26:42.601538 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-jf6hm"] Oct 07 17:26:42 crc kubenswrapper[4681]: I1007 17:26:42.609855 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-jf6hm"] Oct 07 17:26:43 crc kubenswrapper[4681]: I1007 17:26:43.041999 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff96470c-7e4d-4f94-8ed6-c42fb4d63928" path="/var/lib/kubelet/pods/ff96470c-7e4d-4f94-8ed6-c42fb4d63928/volumes" Oct 07 17:26:43 crc kubenswrapper[4681]: I1007 17:26:43.448532 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" event={"ID":"a5b5bb10-eaaa-410b-8040-c9b15d4c0e62","Type":"ContainerStarted","Data":"49c250bfc3141a77f191918a5efa9d08918bb518980bdf4bc242be9581dd7f94"} Oct 07 17:26:43 crc kubenswrapper[4681]: I1007 17:26:43.449204 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" Oct 07 17:26:51 crc kubenswrapper[4681]: I1007 17:26:51.468107 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" Oct 07 17:26:51 crc kubenswrapper[4681]: I1007 17:26:51.487063 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67cb876dc9-zsjhj" podStartSLOduration=10.487048715 podStartE2EDuration="10.487048715s" podCreationTimestamp="2025-10-07 17:26:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:26:43.474740417 +0000 UTC m=+1407.122151972" watchObservedRunningTime="2025-10-07 17:26:51.487048715 +0000 UTC m=+1415.134460270" Oct 07 17:26:51 crc kubenswrapper[4681]: I1007 17:26:51.551009 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-jfpc8"] Oct 07 17:26:51 crc kubenswrapper[4681]: I1007 17:26:51.551233 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d558885bc-jfpc8" podUID="3c5a5567-881d-40ef-8343-60c88eb7d0fe" containerName="dnsmasq-dns" containerID="cri-o://a10d1b50b9519e37bb82e27890c9997c502f2e6964b60f844d8252f7b254da5d" gracePeriod=10 Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.038316 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-jfpc8" Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.210629 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f526p\" (UniqueName: \"kubernetes.io/projected/3c5a5567-881d-40ef-8343-60c88eb7d0fe-kube-api-access-f526p\") pod \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.210747 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-dns-svc\") pod \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.211702 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-dns-swift-storage-0\") pod \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.211757 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-config\") pod \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.211800 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-ovsdbserver-nb\") pod \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.212193 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-ovsdbserver-sb\") pod \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.212235 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-openstack-edpm-ipam\") pod \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\" (UID: \"3c5a5567-881d-40ef-8343-60c88eb7d0fe\") " Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.230924 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c5a5567-881d-40ef-8343-60c88eb7d0fe-kube-api-access-f526p" (OuterVolumeSpecName: "kube-api-access-f526p") pod "3c5a5567-881d-40ef-8343-60c88eb7d0fe" (UID: "3c5a5567-881d-40ef-8343-60c88eb7d0fe"). InnerVolumeSpecName "kube-api-access-f526p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.268002 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-config" (OuterVolumeSpecName: "config") pod "3c5a5567-881d-40ef-8343-60c88eb7d0fe" (UID: "3c5a5567-881d-40ef-8343-60c88eb7d0fe"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.272245 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3c5a5567-881d-40ef-8343-60c88eb7d0fe" (UID: "3c5a5567-881d-40ef-8343-60c88eb7d0fe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.286771 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3c5a5567-881d-40ef-8343-60c88eb7d0fe" (UID: "3c5a5567-881d-40ef-8343-60c88eb7d0fe"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.291264 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "3c5a5567-881d-40ef-8343-60c88eb7d0fe" (UID: "3c5a5567-881d-40ef-8343-60c88eb7d0fe"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.298472 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3c5a5567-881d-40ef-8343-60c88eb7d0fe" (UID: "3c5a5567-881d-40ef-8343-60c88eb7d0fe"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.314277 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f526p\" (UniqueName: \"kubernetes.io/projected/3c5a5567-881d-40ef-8343-60c88eb7d0fe-kube-api-access-f526p\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.314315 4681 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.314326 4681 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.314335 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-config\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.314344 4681 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.314351 4681 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.322429 4681 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3c5a5567-881d-40ef-8343-60c88eb7d0fe" (UID: "3c5a5567-881d-40ef-8343-60c88eb7d0fe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.415809 4681 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c5a5567-881d-40ef-8343-60c88eb7d0fe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.546233 4681 generic.go:334] "Generic (PLEG): container finished" podID="3c5a5567-881d-40ef-8343-60c88eb7d0fe" containerID="a10d1b50b9519e37bb82e27890c9997c502f2e6964b60f844d8252f7b254da5d" exitCode=0 Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.546289 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-jfpc8" event={"ID":"3c5a5567-881d-40ef-8343-60c88eb7d0fe","Type":"ContainerDied","Data":"a10d1b50b9519e37bb82e27890c9997c502f2e6964b60f844d8252f7b254da5d"} Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.546298 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-jfpc8" Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.546325 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-jfpc8" event={"ID":"3c5a5567-881d-40ef-8343-60c88eb7d0fe","Type":"ContainerDied","Data":"176a4fd116e7862b0342f919214c423fd5fc377df8a70afc2a125db2bddd4ca1"} Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.546344 4681 scope.go:117] "RemoveContainer" containerID="a10d1b50b9519e37bb82e27890c9997c502f2e6964b60f844d8252f7b254da5d" Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.570357 4681 scope.go:117] "RemoveContainer" containerID="f92c2a1bca06ac5ea9450cf3471b9f6e11207ebbe1e477bfaab38fc2d49300e1" Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.587053 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-jfpc8"] Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.594166 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-jfpc8"] Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.612415 4681 scope.go:117] "RemoveContainer" containerID="a10d1b50b9519e37bb82e27890c9997c502f2e6964b60f844d8252f7b254da5d" Oct 07 17:26:52 crc kubenswrapper[4681]: E1007 17:26:52.613035 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a10d1b50b9519e37bb82e27890c9997c502f2e6964b60f844d8252f7b254da5d\": container with ID starting with a10d1b50b9519e37bb82e27890c9997c502f2e6964b60f844d8252f7b254da5d not found: ID does not exist" containerID="a10d1b50b9519e37bb82e27890c9997c502f2e6964b60f844d8252f7b254da5d" Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.613086 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a10d1b50b9519e37bb82e27890c9997c502f2e6964b60f844d8252f7b254da5d"} err="failed to get container status \"a10d1b50b9519e37bb82e27890c9997c502f2e6964b60f844d8252f7b254da5d\": rpc error: code = NotFound desc = could not find container \"a10d1b50b9519e37bb82e27890c9997c502f2e6964b60f844d8252f7b254da5d\": container with ID starting with 
a10d1b50b9519e37bb82e27890c9997c502f2e6964b60f844d8252f7b254da5d not found: ID does not exist" Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.613121 4681 scope.go:117] "RemoveContainer" containerID="f92c2a1bca06ac5ea9450cf3471b9f6e11207ebbe1e477bfaab38fc2d49300e1" Oct 07 17:26:52 crc kubenswrapper[4681]: E1007 17:26:52.613635 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f92c2a1bca06ac5ea9450cf3471b9f6e11207ebbe1e477bfaab38fc2d49300e1\": container with ID starting with f92c2a1bca06ac5ea9450cf3471b9f6e11207ebbe1e477bfaab38fc2d49300e1 not found: ID does not exist" containerID="f92c2a1bca06ac5ea9450cf3471b9f6e11207ebbe1e477bfaab38fc2d49300e1" Oct 07 17:26:52 crc kubenswrapper[4681]: I1007 17:26:52.613657 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f92c2a1bca06ac5ea9450cf3471b9f6e11207ebbe1e477bfaab38fc2d49300e1"} err="failed to get container status \"f92c2a1bca06ac5ea9450cf3471b9f6e11207ebbe1e477bfaab38fc2d49300e1\": rpc error: code = NotFound desc = could not find container \"f92c2a1bca06ac5ea9450cf3471b9f6e11207ebbe1e477bfaab38fc2d49300e1\": container with ID starting with f92c2a1bca06ac5ea9450cf3471b9f6e11207ebbe1e477bfaab38fc2d49300e1 not found: ID does not exist" Oct 07 17:26:53 crc kubenswrapper[4681]: I1007 17:26:53.038607 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c5a5567-881d-40ef-8343-60c88eb7d0fe" path="/var/lib/kubelet/pods/3c5a5567-881d-40ef-8343-60c88eb7d0fe/volumes" Oct 07 17:27:03 crc kubenswrapper[4681]: I1007 17:27:03.654501 4681 generic.go:334] "Generic (PLEG): container finished" podID="6b4aa12d-0e45-47d7-b279-e705aef9c323" containerID="2cdb12a0b8a4a18b9c681d92f29ecf8eee4ce67cacd19c8c5f1c73bd2f52fcc6" exitCode=0 Oct 07 17:27:03 crc kubenswrapper[4681]: I1007 17:27:03.654638 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6b4aa12d-0e45-47d7-b279-e705aef9c323","Type":"ContainerDied","Data":"2cdb12a0b8a4a18b9c681d92f29ecf8eee4ce67cacd19c8c5f1c73bd2f52fcc6"} Oct 07 17:27:03 crc kubenswrapper[4681]: I1007 17:27:03.659160 4681 generic.go:334] "Generic (PLEG): container finished" podID="4222be9f-615b-431f-9285-c629a68426e0" containerID="f7403cd123b83b167ddd82aa330e366f5a12cb66d5ae6b773c355fca298dfa2d" exitCode=0 Oct 07 17:27:03 crc kubenswrapper[4681]: I1007 17:27:03.659231 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4222be9f-615b-431f-9285-c629a68426e0","Type":"ContainerDied","Data":"f7403cd123b83b167ddd82aa330e366f5a12cb66d5ae6b773c355fca298dfa2d"} Oct 07 17:27:04 crc kubenswrapper[4681]: I1007 17:27:04.669119 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6b4aa12d-0e45-47d7-b279-e705aef9c323","Type":"ContainerStarted","Data":"604de2faebdf5451ab7933a51a73b7f7ca99c3bf8c93dea7e1da53d2c61e32fa"} Oct 07 17:27:04 crc kubenswrapper[4681]: I1007 17:27:04.671359 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 07 17:27:04 crc kubenswrapper[4681]: I1007 17:27:04.671608 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4222be9f-615b-431f-9285-c629a68426e0","Type":"ContainerStarted","Data":"7293e8bfca905587e894bb2799492d4489d2a96865e516cfa3964709e27dc5d0"} Oct 07 17:27:04 crc kubenswrapper[4681]: I1007 17:27:04.672923 4681 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:27:04 crc kubenswrapper[4681]: I1007 17:27:04.712258 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.712242724 podStartE2EDuration="36.712242724s" podCreationTimestamp="2025-10-07 17:26:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:27:04.709769756 +0000 UTC m=+1428.357181321" watchObservedRunningTime="2025-10-07 17:27:04.712242724 +0000 UTC m=+1428.359654269" Oct 07 17:27:04 crc kubenswrapper[4681]: I1007 17:27:04.750903 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.750889479 podStartE2EDuration="36.750889479s" podCreationTimestamp="2025-10-07 17:26:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:27:04.750782576 +0000 UTC m=+1428.398194131" watchObservedRunningTime="2025-10-07 17:27:04.750889479 +0000 UTC m=+1428.398301034" Oct 07 17:27:15 crc kubenswrapper[4681]: I1007 17:27:15.119211 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj"] Oct 07 17:27:15 crc kubenswrapper[4681]: E1007 17:27:15.120777 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff96470c-7e4d-4f94-8ed6-c42fb4d63928" containerName="dnsmasq-dns" Oct 07 17:27:15 crc kubenswrapper[4681]: I1007 17:27:15.120800 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff96470c-7e4d-4f94-8ed6-c42fb4d63928" containerName="dnsmasq-dns" Oct 07 17:27:15 crc kubenswrapper[4681]: E1007 17:27:15.120822 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c5a5567-881d-40ef-8343-60c88eb7d0fe" containerName="dnsmasq-dns" Oct 07 17:27:15 crc kubenswrapper[4681]: I1007 17:27:15.120830 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5a5567-881d-40ef-8343-60c88eb7d0fe" containerName="dnsmasq-dns" Oct 07 17:27:15 crc kubenswrapper[4681]: E1007 17:27:15.120868 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff96470c-7e4d-4f94-8ed6-c42fb4d63928" containerName="init" Oct 07 17:27:15 crc kubenswrapper[4681]: I1007 17:27:15.120905 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff96470c-7e4d-4f94-8ed6-c42fb4d63928" containerName="init" Oct 07 17:27:15 crc kubenswrapper[4681]: E1007 17:27:15.120952 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c5a5567-881d-40ef-8343-60c88eb7d0fe" containerName="init" Oct 07 17:27:15 crc kubenswrapper[4681]: I1007 17:27:15.120958 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5a5567-881d-40ef-8343-60c88eb7d0fe" containerName="init" Oct 07 17:27:15 crc kubenswrapper[4681]: I1007 17:27:15.121213 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff96470c-7e4d-4f94-8ed6-c42fb4d63928" containerName="dnsmasq-dns" Oct 07 17:27:15 crc kubenswrapper[4681]: I1007 17:27:15.121246 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c5a5567-881d-40ef-8343-60c88eb7d0fe" containerName="dnsmasq-dns" Oct 07 17:27:15 crc kubenswrapper[4681]: I1007 17:27:15.122205 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj" Oct 07 17:27:15 crc kubenswrapper[4681]: I1007 17:27:15.124581 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vtl6" Oct 07 17:27:15 crc kubenswrapper[4681]: I1007 17:27:15.126328 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 17:27:15 crc kubenswrapper[4681]: I1007 17:27:15.129194 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 17:27:15 crc kubenswrapper[4681]: I1007 17:27:15.133376 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 17:27:15 crc kubenswrapper[4681]: I1007 17:27:15.144478 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj"] Oct 07 17:27:15 crc kubenswrapper[4681]: I1007 17:27:15.265730 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngpsm\" (UniqueName: \"kubernetes.io/projected/a1d74e17-5142-40f0-9847-0f9ee5e33f90-kube-api-access-ngpsm\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj\" (UID: \"a1d74e17-5142-40f0-9847-0f9ee5e33f90\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj" Oct 07 17:27:15 crc kubenswrapper[4681]: I1007 17:27:15.265776 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a1d74e17-5142-40f0-9847-0f9ee5e33f90-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj\" (UID: \"a1d74e17-5142-40f0-9847-0f9ee5e33f90\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj" Oct 07 17:27:15 crc kubenswrapper[4681]: I1007 17:27:15.266014 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1d74e17-5142-40f0-9847-0f9ee5e33f90-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj\" (UID: \"a1d74e17-5142-40f0-9847-0f9ee5e33f90\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj" Oct 07 17:27:15 crc kubenswrapper[4681]: I1007 17:27:15.266115 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1d74e17-5142-40f0-9847-0f9ee5e33f90-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj\" (UID: \"a1d74e17-5142-40f0-9847-0f9ee5e33f90\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj" Oct 07 17:27:15 crc kubenswrapper[4681]: I1007 17:27:15.367690 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1d74e17-5142-40f0-9847-0f9ee5e33f90-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj\" (UID: \"a1d74e17-5142-40f0-9847-0f9ee5e33f90\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj" Oct 07 17:27:15 crc kubenswrapper[4681]: I1007 17:27:15.367769 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1d74e17-5142-40f0-9847-0f9ee5e33f90-repo-setup-combined-ca-bundle\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj\" (UID: \"a1d74e17-5142-40f0-9847-0f9ee5e33f90\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj" Oct 07 17:27:15 crc kubenswrapper[4681]: I1007 17:27:15.367795 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngpsm\" (UniqueName: \"kubernetes.io/projected/a1d74e17-5142-40f0-9847-0f9ee5e33f90-kube-api-access-ngpsm\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj\" (UID: \"a1d74e17-5142-40f0-9847-0f9ee5e33f90\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj" Oct 07 17:27:15 crc kubenswrapper[4681]: I1007 17:27:15.367818 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a1d74e17-5142-40f0-9847-0f9ee5e33f90-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj\" (UID: \"a1d74e17-5142-40f0-9847-0f9ee5e33f90\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj" Oct 07 17:27:15 crc kubenswrapper[4681]: I1007 17:27:15.373652 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a1d74e17-5142-40f0-9847-0f9ee5e33f90-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj\" (UID: \"a1d74e17-5142-40f0-9847-0f9ee5e33f90\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj" Oct 07 17:27:15 crc kubenswrapper[4681]: I1007 17:27:15.374202 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1d74e17-5142-40f0-9847-0f9ee5e33f90-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj\" (UID: \"a1d74e17-5142-40f0-9847-0f9ee5e33f90\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj" Oct 07 17:27:15 crc kubenswrapper[4681]: I1007 17:27:15.384414 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1d74e17-5142-40f0-9847-0f9ee5e33f90-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj\" (UID: \"a1d74e17-5142-40f0-9847-0f9ee5e33f90\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj" Oct 07 17:27:15 crc kubenswrapper[4681]: I1007 17:27:15.389163 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngpsm\" (UniqueName: \"kubernetes.io/projected/a1d74e17-5142-40f0-9847-0f9ee5e33f90-kube-api-access-ngpsm\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj\" (UID: \"a1d74e17-5142-40f0-9847-0f9ee5e33f90\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj" Oct 07 17:27:15 crc kubenswrapper[4681]: I1007 17:27:15.443064 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj" Oct 07 17:27:16 crc kubenswrapper[4681]: I1007 17:27:16.067100 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj"] Oct 07 17:27:16 crc kubenswrapper[4681]: I1007 17:27:16.779053 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj" event={"ID":"a1d74e17-5142-40f0-9847-0f9ee5e33f90","Type":"ContainerStarted","Data":"f2e81e28fd511d21744d3e3817a1c3edb4db3c6a99268657e6a1bbdbb7786c58"} Oct 07 17:27:18 crc kubenswrapper[4681]: I1007 17:27:18.542681 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qzwrf"] Oct 07 17:27:18 crc kubenswrapper[4681]: I1007 17:27:18.547833 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qzwrf" Oct 07 17:27:18 crc kubenswrapper[4681]: I1007 17:27:18.559228 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qzwrf"] Oct 07 17:27:18 crc kubenswrapper[4681]: I1007 17:27:18.735750 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96e4d622-c9ad-4f56-95f7-c9f8be1e78df-catalog-content\") pod \"community-operators-qzwrf\" (UID: \"96e4d622-c9ad-4f56-95f7-c9f8be1e78df\") " pod="openshift-marketplace/community-operators-qzwrf" Oct 07 17:27:18 crc kubenswrapper[4681]: I1007 17:27:18.735865 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96e4d622-c9ad-4f56-95f7-c9f8be1e78df-utilities\") pod \"community-operators-qzwrf\" (UID: \"96e4d622-c9ad-4f56-95f7-c9f8be1e78df\") " pod="openshift-marketplace/community-operators-qzwrf" Oct 07 17:27:18 crc kubenswrapper[4681]: I1007 17:27:18.735958 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z772t\" (UniqueName: \"kubernetes.io/projected/96e4d622-c9ad-4f56-95f7-c9f8be1e78df-kube-api-access-z772t\") pod \"community-operators-qzwrf\" (UID: \"96e4d622-c9ad-4f56-95f7-c9f8be1e78df\") " pod="openshift-marketplace/community-operators-qzwrf" Oct 07 17:27:18 crc kubenswrapper[4681]: I1007 17:27:18.819082 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 07 17:27:18 crc kubenswrapper[4681]: I1007 17:27:18.838371 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96e4d622-c9ad-4f56-95f7-c9f8be1e78df-catalog-content\") pod \"community-operators-qzwrf\" (UID: \"96e4d622-c9ad-4f56-95f7-c9f8be1e78df\") " pod="openshift-marketplace/community-operators-qzwrf" Oct 07 17:27:18 crc kubenswrapper[4681]: I1007 17:27:18.838471 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96e4d622-c9ad-4f56-95f7-c9f8be1e78df-utilities\") pod \"community-operators-qzwrf\" (UID: \"96e4d622-c9ad-4f56-95f7-c9f8be1e78df\") " pod="openshift-marketplace/community-operators-qzwrf" Oct 07 17:27:18 crc kubenswrapper[4681]: I1007 17:27:18.838536 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z772t\" (UniqueName: 
\"kubernetes.io/projected/96e4d622-c9ad-4f56-95f7-c9f8be1e78df-kube-api-access-z772t\") pod \"community-operators-qzwrf\" (UID: \"96e4d622-c9ad-4f56-95f7-c9f8be1e78df\") " pod="openshift-marketplace/community-operators-qzwrf" Oct 07 17:27:18 crc kubenswrapper[4681]: I1007 17:27:18.839641 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96e4d622-c9ad-4f56-95f7-c9f8be1e78df-catalog-content\") pod \"community-operators-qzwrf\" (UID: \"96e4d622-c9ad-4f56-95f7-c9f8be1e78df\") " pod="openshift-marketplace/community-operators-qzwrf" Oct 07 17:27:18 crc kubenswrapper[4681]: I1007 17:27:18.839963 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96e4d622-c9ad-4f56-95f7-c9f8be1e78df-utilities\") pod \"community-operators-qzwrf\" (UID: \"96e4d622-c9ad-4f56-95f7-c9f8be1e78df\") " pod="openshift-marketplace/community-operators-qzwrf" Oct 07 17:27:18 crc kubenswrapper[4681]: I1007 17:27:18.871011 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z772t\" (UniqueName: \"kubernetes.io/projected/96e4d622-c9ad-4f56-95f7-c9f8be1e78df-kube-api-access-z772t\") pod \"community-operators-qzwrf\" (UID: \"96e4d622-c9ad-4f56-95f7-c9f8be1e78df\") " pod="openshift-marketplace/community-operators-qzwrf" Oct 07 17:27:18 crc kubenswrapper[4681]: I1007 17:27:18.885380 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qzwrf" Oct 07 17:27:19 crc kubenswrapper[4681]: I1007 17:27:19.092985 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 07 17:27:19 crc kubenswrapper[4681]: W1007 17:27:19.591029 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96e4d622_c9ad_4f56_95f7_c9f8be1e78df.slice/crio-2db835d1deaf80b5fb9dcca4bdb0ff3f4083bb15f39399e7bdf79d17f1fe44ed WatchSource:0}: Error finding container 2db835d1deaf80b5fb9dcca4bdb0ff3f4083bb15f39399e7bdf79d17f1fe44ed: Status 404 returned error can't find the container with id 2db835d1deaf80b5fb9dcca4bdb0ff3f4083bb15f39399e7bdf79d17f1fe44ed Oct 07 17:27:19 crc kubenswrapper[4681]: I1007 17:27:19.591513 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qzwrf"] Oct 07 17:27:19 crc kubenswrapper[4681]: I1007 17:27:19.810931 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qzwrf" event={"ID":"96e4d622-c9ad-4f56-95f7-c9f8be1e78df","Type":"ContainerStarted","Data":"2db835d1deaf80b5fb9dcca4bdb0ff3f4083bb15f39399e7bdf79d17f1fe44ed"} Oct 07 17:27:20 crc kubenswrapper[4681]: I1007 17:27:20.824805 4681 generic.go:334] "Generic (PLEG): container finished" podID="96e4d622-c9ad-4f56-95f7-c9f8be1e78df" containerID="e22269b46256d189bc6b83fab3423383221f46e2f2e5d0bf68197854e11996df" exitCode=0 Oct 07 17:27:20 crc kubenswrapper[4681]: I1007 17:27:20.824859 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qzwrf" event={"ID":"96e4d622-c9ad-4f56-95f7-c9f8be1e78df","Type":"ContainerDied","Data":"e22269b46256d189bc6b83fab3423383221f46e2f2e5d0bf68197854e11996df"} Oct 07 17:27:22 crc kubenswrapper[4681]: I1007 17:27:22.859043 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qzwrf" 
event={"ID":"96e4d622-c9ad-4f56-95f7-c9f8be1e78df","Type":"ContainerStarted","Data":"d803f665df9e2b9ff3fa667a81ed9778da8cd3ee97bb18c515d86aa27395c15d"} Oct 07 17:27:24 crc kubenswrapper[4681]: I1007 17:27:24.589771 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rqdtk"] Oct 07 17:27:24 crc kubenswrapper[4681]: I1007 17:27:24.593000 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rqdtk" Oct 07 17:27:24 crc kubenswrapper[4681]: I1007 17:27:24.598378 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rqdtk"] Oct 07 17:27:24 crc kubenswrapper[4681]: I1007 17:27:24.665826 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1edd6c28-9cf6-4669-acff-8c53930a4342-catalog-content\") pod \"certified-operators-rqdtk\" (UID: \"1edd6c28-9cf6-4669-acff-8c53930a4342\") " pod="openshift-marketplace/certified-operators-rqdtk" Oct 07 17:27:24 crc kubenswrapper[4681]: I1007 17:27:24.666007 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmvmj\" (UniqueName: \"kubernetes.io/projected/1edd6c28-9cf6-4669-acff-8c53930a4342-kube-api-access-rmvmj\") pod \"certified-operators-rqdtk\" (UID: \"1edd6c28-9cf6-4669-acff-8c53930a4342\") " pod="openshift-marketplace/certified-operators-rqdtk" Oct 07 17:27:24 crc kubenswrapper[4681]: I1007 17:27:24.666062 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1edd6c28-9cf6-4669-acff-8c53930a4342-utilities\") pod \"certified-operators-rqdtk\" (UID: \"1edd6c28-9cf6-4669-acff-8c53930a4342\") " pod="openshift-marketplace/certified-operators-rqdtk" Oct 07 17:27:24 crc kubenswrapper[4681]: I1007 17:27:24.770337 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmvmj\" (UniqueName: \"kubernetes.io/projected/1edd6c28-9cf6-4669-acff-8c53930a4342-kube-api-access-rmvmj\") pod \"certified-operators-rqdtk\" (UID: \"1edd6c28-9cf6-4669-acff-8c53930a4342\") " pod="openshift-marketplace/certified-operators-rqdtk" Oct 07 17:27:24 crc kubenswrapper[4681]: I1007 17:27:24.770444 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1edd6c28-9cf6-4669-acff-8c53930a4342-utilities\") pod \"certified-operators-rqdtk\" (UID: \"1edd6c28-9cf6-4669-acff-8c53930a4342\") " pod="openshift-marketplace/certified-operators-rqdtk" Oct 07 17:27:24 crc kubenswrapper[4681]: I1007 17:27:24.770632 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1edd6c28-9cf6-4669-acff-8c53930a4342-catalog-content\") pod \"certified-operators-rqdtk\" (UID: \"1edd6c28-9cf6-4669-acff-8c53930a4342\") " pod="openshift-marketplace/certified-operators-rqdtk" Oct 07 17:27:24 crc kubenswrapper[4681]: I1007 17:27:24.771690 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1edd6c28-9cf6-4669-acff-8c53930a4342-utilities\") pod \"certified-operators-rqdtk\" (UID: \"1edd6c28-9cf6-4669-acff-8c53930a4342\") " pod="openshift-marketplace/certified-operators-rqdtk" Oct 07 17:27:24 crc kubenswrapper[4681]: I1007 
17:27:24.771747 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1edd6c28-9cf6-4669-acff-8c53930a4342-catalog-content\") pod \"certified-operators-rqdtk\" (UID: \"1edd6c28-9cf6-4669-acff-8c53930a4342\") " pod="openshift-marketplace/certified-operators-rqdtk" Oct 07 17:27:24 crc kubenswrapper[4681]: I1007 17:27:24.793978 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmvmj\" (UniqueName: \"kubernetes.io/projected/1edd6c28-9cf6-4669-acff-8c53930a4342-kube-api-access-rmvmj\") pod \"certified-operators-rqdtk\" (UID: \"1edd6c28-9cf6-4669-acff-8c53930a4342\") " pod="openshift-marketplace/certified-operators-rqdtk" Oct 07 17:27:24 crc kubenswrapper[4681]: I1007 17:27:24.882190 4681 generic.go:334] "Generic (PLEG): container finished" podID="96e4d622-c9ad-4f56-95f7-c9f8be1e78df" containerID="d803f665df9e2b9ff3fa667a81ed9778da8cd3ee97bb18c515d86aa27395c15d" exitCode=0 Oct 07 17:27:24 crc kubenswrapper[4681]: I1007 17:27:24.882234 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qzwrf" event={"ID":"96e4d622-c9ad-4f56-95f7-c9f8be1e78df","Type":"ContainerDied","Data":"d803f665df9e2b9ff3fa667a81ed9778da8cd3ee97bb18c515d86aa27395c15d"} Oct 07 17:27:24 crc kubenswrapper[4681]: I1007 17:27:24.922332 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rqdtk" Oct 07 17:27:33 crc kubenswrapper[4681]: E1007 17:27:33.392541 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/openstack-k8s-operators/openstack-ansibleee-runner:7f7b37ee390403faf0b3b118ad27db355e307937" Oct 07 17:27:33 crc kubenswrapper[4681]: E1007 17:27:33.393082 4681 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/openstack-k8s-operators/openstack-ansibleee-runner:7f7b37ee390403faf0b3b118ad27db355e307937" Oct 07 17:27:33 crc kubenswrapper[4681]: E1007 17:27:33.393218 4681 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 07 17:27:33 crc kubenswrapper[4681]: container &Container{Name:repo-setup-edpm-deployment-openstack-edpm-ipam,Image:quay.rdoproject.org/openstack-k8s-operators/openstack-ansibleee-runner:7f7b37ee390403faf0b3b118ad27db355e307937,Command:[],Args:[ansible-runner run /runner -p playbook.yaml -i repo-setup-edpm-deployment-openstack-edpm-ipam],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ANSIBLE_VERBOSITY,Value:2,ValueFrom:nil,},EnvVar{Name:RUNNER_PLAYBOOK,Value: Oct 07 17:27:33 crc kubenswrapper[4681]: - hosts: all Oct 07 17:27:33 crc kubenswrapper[4681]: strategy: linear Oct 07 17:27:33 crc kubenswrapper[4681]: tasks: Oct 07 17:27:33 crc kubenswrapper[4681]: - name: Enable podified-repos Oct 07 17:27:33 crc kubenswrapper[4681]: become: true Oct 07 17:27:33 crc kubenswrapper[4681]: ansible.builtin.shell: | Oct 07 17:27:33 crc kubenswrapper[4681]: set -euxo pipefail Oct 07 17:27:33 crc kubenswrapper[4681]: pushd /var/tmp Oct 07 17:27:33 crc kubenswrapper[4681]: curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz Oct 07 17:27:33 crc kubenswrapper[4681]: pushd repo-setup-main Oct 07 17:27:33 crc kubenswrapper[4681]: python3 -m venv ./venv Oct 07 17:27:33 crc kubenswrapper[4681]: PBR_VERSION=0.0.0 ./venv/bin/pip 
install ./ Oct 07 17:27:33 crc kubenswrapper[4681]: ./venv/bin/repo-setup current-podified -b antelope Oct 07 17:27:33 crc kubenswrapper[4681]: popd Oct 07 17:27:33 crc kubenswrapper[4681]: rm -rf repo-setup-main Oct 07 17:27:33 crc kubenswrapper[4681]: Oct 07 17:27:33 crc kubenswrapper[4681]: Oct 07 17:27:33 crc kubenswrapper[4681]: ,ValueFrom:nil,},EnvVar{Name:RUNNER_EXTRA_VARS,Value: Oct 07 17:27:33 crc kubenswrapper[4681]: edpm_override_hosts: openstack-edpm-ipam Oct 07 17:27:33 crc kubenswrapper[4681]: edpm_service_type: repo-setup Oct 07 17:27:33 crc kubenswrapper[4681]: Oct 07 17:27:33 crc kubenswrapper[4681]: Oct 07 17:27:33 crc kubenswrapper[4681]: ,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:repo-setup-combined-ca-bundle,ReadOnly:false,MountPath:/var/lib/openstack/cacerts/repo-setup,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/runner/env/ssh_key,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:inventory,ReadOnly:false,MountPath:/runner/inventory/hosts,SubPath:inventory,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ngpsm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:openstack-aee-default-env,},Optional:*true,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj_openstack(a1d74e17-5142-40f0-9847-0f9ee5e33f90): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Oct 07 17:27:33 crc kubenswrapper[4681]: > logger="UnhandledError" Oct 07 17:27:33 crc kubenswrapper[4681]: E1007 17:27:33.394428 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj" podUID="a1d74e17-5142-40f0-9847-0f9ee5e33f90" Oct 07 17:27:33 crc kubenswrapper[4681]: I1007 17:27:33.701529 4681 scope.go:117] "RemoveContainer" containerID="4b0386f421398abfcce82612a18baa1f10211504d81960f450d73cc421a70798" Oct 07 17:27:33 crc kubenswrapper[4681]: I1007 17:27:33.721680 4681 scope.go:117] "RemoveContainer" containerID="71834050946ec519427085e9d34a98bce4e7f3aa477d5ac163e4c02b21a71def" Oct 07 17:27:33 crc kubenswrapper[4681]: I1007 17:27:33.757970 4681 scope.go:117] "RemoveContainer" containerID="684797f48d112b934082488598634aba95ab60ac31835fe41f631c9666e298a5" Oct 07 17:27:33 crc kubenswrapper[4681]: I1007 17:27:33.817143 4681 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rqdtk"] Oct 07 17:27:33 crc kubenswrapper[4681]: W1007 17:27:33.842796 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1edd6c28_9cf6_4669_acff_8c53930a4342.slice/crio-b14271d9d393166e42546783692361830c57b76057855b850926bef101dba14e WatchSource:0}: Error finding container b14271d9d393166e42546783692361830c57b76057855b850926bef101dba14e: Status 404 returned error can't find the container with id b14271d9d393166e42546783692361830c57b76057855b850926bef101dba14e Oct 07 17:27:33 crc kubenswrapper[4681]: I1007 17:27:33.984790 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qzwrf" event={"ID":"96e4d622-c9ad-4f56-95f7-c9f8be1e78df","Type":"ContainerStarted","Data":"c5e46eb3e8024430b452cdc83c2e2150bb5a34ace25d01d29302b934d485dcc4"} Oct 07 17:27:33 crc kubenswrapper[4681]: I1007 17:27:33.993527 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqdtk" event={"ID":"1edd6c28-9cf6-4669-acff-8c53930a4342","Type":"ContainerStarted","Data":"b14271d9d393166e42546783692361830c57b76057855b850926bef101dba14e"} Oct 07 17:27:33 crc kubenswrapper[4681]: E1007 17:27:33.994242 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/openstack-k8s-operators/openstack-ansibleee-runner:7f7b37ee390403faf0b3b118ad27db355e307937\\\"\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj" podUID="a1d74e17-5142-40f0-9847-0f9ee5e33f90" Oct 07 17:27:34 crc kubenswrapper[4681]: I1007 17:27:34.010400 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qzwrf" podStartSLOduration=3.423073976 podStartE2EDuration="16.010261828s" podCreationTimestamp="2025-10-07 17:27:18 +0000 UTC" firstStartedPulling="2025-10-07 17:27:20.829096672 +0000 UTC m=+1444.476508227" lastFinishedPulling="2025-10-07 17:27:33.416284524 +0000 UTC m=+1457.063696079" observedRunningTime="2025-10-07 17:27:34.008165391 +0000 UTC m=+1457.655576946" watchObservedRunningTime="2025-10-07 17:27:34.010261828 +0000 UTC m=+1457.657673383" Oct 07 17:27:35 crc kubenswrapper[4681]: I1007 17:27:35.003322 4681 generic.go:334] "Generic (PLEG): container finished" podID="1edd6c28-9cf6-4669-acff-8c53930a4342" containerID="2043898f68be55672495fa8a12c395f0daf5ac4fe4c8c2633e193848be5bb08c" exitCode=0 Oct 07 17:27:35 crc kubenswrapper[4681]: I1007 17:27:35.003406 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqdtk" event={"ID":"1edd6c28-9cf6-4669-acff-8c53930a4342","Type":"ContainerDied","Data":"2043898f68be55672495fa8a12c395f0daf5ac4fe4c8c2633e193848be5bb08c"} Oct 07 17:27:37 crc kubenswrapper[4681]: I1007 17:27:37.027002 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqdtk" event={"ID":"1edd6c28-9cf6-4669-acff-8c53930a4342","Type":"ContainerStarted","Data":"4e7727cc78c87af4074a8b7907f95f86e6d843f0cb869b915ae7771ab259b354"} Oct 07 17:27:38 crc kubenswrapper[4681]: I1007 17:27:38.885803 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qzwrf" Oct 07 17:27:38 crc 
kubenswrapper[4681]: I1007 17:27:38.886186 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qzwrf" Oct 07 17:27:39 crc kubenswrapper[4681]: I1007 17:27:39.044787 4681 generic.go:334] "Generic (PLEG): container finished" podID="1edd6c28-9cf6-4669-acff-8c53930a4342" containerID="4e7727cc78c87af4074a8b7907f95f86e6d843f0cb869b915ae7771ab259b354" exitCode=0 Oct 07 17:27:39 crc kubenswrapper[4681]: I1007 17:27:39.044899 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqdtk" event={"ID":"1edd6c28-9cf6-4669-acff-8c53930a4342","Type":"ContainerDied","Data":"4e7727cc78c87af4074a8b7907f95f86e6d843f0cb869b915ae7771ab259b354"} Oct 07 17:27:39 crc kubenswrapper[4681]: I1007 17:27:39.941169 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-qzwrf" podUID="96e4d622-c9ad-4f56-95f7-c9f8be1e78df" containerName="registry-server" probeResult="failure" output=< Oct 07 17:27:39 crc kubenswrapper[4681]: timeout: failed to connect service ":50051" within 1s Oct 07 17:27:39 crc kubenswrapper[4681]: > Oct 07 17:27:40 crc kubenswrapper[4681]: I1007 17:27:40.057914 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqdtk" event={"ID":"1edd6c28-9cf6-4669-acff-8c53930a4342","Type":"ContainerStarted","Data":"4c2fcd3e18dc5c012666d34f8e0d0817b9dd9d684107f835925e0d935fbc5925"} Oct 07 17:27:40 crc kubenswrapper[4681]: I1007 17:27:40.085291 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rqdtk" podStartSLOduration=11.569885612 podStartE2EDuration="16.085273827s" podCreationTimestamp="2025-10-07 17:27:24 +0000 UTC" firstStartedPulling="2025-10-07 17:27:35.005640476 +0000 UTC m=+1458.653052031" lastFinishedPulling="2025-10-07 17:27:39.521028691 +0000 UTC m=+1463.168440246" observedRunningTime="2025-10-07 17:27:40.07739873 +0000 UTC m=+1463.724810305" watchObservedRunningTime="2025-10-07 17:27:40.085273827 +0000 UTC m=+1463.732685382" Oct 07 17:27:42 crc kubenswrapper[4681]: I1007 17:27:42.195205 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 17:27:42 crc kubenswrapper[4681]: I1007 17:27:42.195521 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 17:27:44 crc kubenswrapper[4681]: I1007 17:27:44.922933 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rqdtk" Oct 07 17:27:44 crc kubenswrapper[4681]: I1007 17:27:44.923393 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rqdtk" Oct 07 17:27:45 crc kubenswrapper[4681]: I1007 17:27:45.963631 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-rqdtk" podUID="1edd6c28-9cf6-4669-acff-8c53930a4342" containerName="registry-server" probeResult="failure" output=< Oct 07 17:27:45 crc 
kubenswrapper[4681]: timeout: failed to connect service ":50051" within 1s Oct 07 17:27:45 crc kubenswrapper[4681]: > Oct 07 17:27:48 crc kubenswrapper[4681]: I1007 17:27:48.213853 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 17:27:48 crc kubenswrapper[4681]: I1007 17:27:48.946837 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qzwrf" Oct 07 17:27:48 crc kubenswrapper[4681]: I1007 17:27:48.996303 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qzwrf" Oct 07 17:27:49 crc kubenswrapper[4681]: I1007 17:27:49.135350 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj" event={"ID":"a1d74e17-5142-40f0-9847-0f9ee5e33f90","Type":"ContainerStarted","Data":"dd39062eb6c3718ca32455c2951b10450478e88e3c1abb0fb9c47c284678adc5"} Oct 07 17:27:49 crc kubenswrapper[4681]: I1007 17:27:49.157145 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj" podStartSLOduration=2.010230244 podStartE2EDuration="34.157129439s" podCreationTimestamp="2025-10-07 17:27:15 +0000 UTC" firstStartedPulling="2025-10-07 17:27:16.064664579 +0000 UTC m=+1439.712076134" lastFinishedPulling="2025-10-07 17:27:48.211563774 +0000 UTC m=+1471.858975329" observedRunningTime="2025-10-07 17:27:49.152242315 +0000 UTC m=+1472.799653870" watchObservedRunningTime="2025-10-07 17:27:49.157129439 +0000 UTC m=+1472.804540994" Oct 07 17:27:49 crc kubenswrapper[4681]: I1007 17:27:49.739767 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qzwrf"] Oct 07 17:27:50 crc kubenswrapper[4681]: I1007 17:27:50.141454 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qzwrf" podUID="96e4d622-c9ad-4f56-95f7-c9f8be1e78df" containerName="registry-server" containerID="cri-o://c5e46eb3e8024430b452cdc83c2e2150bb5a34ace25d01d29302b934d485dcc4" gracePeriod=2 Oct 07 17:27:51 crc kubenswrapper[4681]: I1007 17:27:51.151371 4681 generic.go:334] "Generic (PLEG): container finished" podID="96e4d622-c9ad-4f56-95f7-c9f8be1e78df" containerID="c5e46eb3e8024430b452cdc83c2e2150bb5a34ace25d01d29302b934d485dcc4" exitCode=0 Oct 07 17:27:51 crc kubenswrapper[4681]: I1007 17:27:51.151712 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qzwrf" event={"ID":"96e4d622-c9ad-4f56-95f7-c9f8be1e78df","Type":"ContainerDied","Data":"c5e46eb3e8024430b452cdc83c2e2150bb5a34ace25d01d29302b934d485dcc4"} Oct 07 17:27:51 crc kubenswrapper[4681]: I1007 17:27:51.248189 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qzwrf" Oct 07 17:27:51 crc kubenswrapper[4681]: I1007 17:27:51.326618 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z772t\" (UniqueName: \"kubernetes.io/projected/96e4d622-c9ad-4f56-95f7-c9f8be1e78df-kube-api-access-z772t\") pod \"96e4d622-c9ad-4f56-95f7-c9f8be1e78df\" (UID: \"96e4d622-c9ad-4f56-95f7-c9f8be1e78df\") " Oct 07 17:27:51 crc kubenswrapper[4681]: I1007 17:27:51.326714 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96e4d622-c9ad-4f56-95f7-c9f8be1e78df-utilities\") pod \"96e4d622-c9ad-4f56-95f7-c9f8be1e78df\" (UID: \"96e4d622-c9ad-4f56-95f7-c9f8be1e78df\") " Oct 07 17:27:51 crc kubenswrapper[4681]: I1007 17:27:51.326740 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96e4d622-c9ad-4f56-95f7-c9f8be1e78df-catalog-content\") pod \"96e4d622-c9ad-4f56-95f7-c9f8be1e78df\" (UID: \"96e4d622-c9ad-4f56-95f7-c9f8be1e78df\") " Oct 07 17:27:51 crc kubenswrapper[4681]: I1007 17:27:51.327314 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96e4d622-c9ad-4f56-95f7-c9f8be1e78df-utilities" (OuterVolumeSpecName: "utilities") pod "96e4d622-c9ad-4f56-95f7-c9f8be1e78df" (UID: "96e4d622-c9ad-4f56-95f7-c9f8be1e78df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:27:51 crc kubenswrapper[4681]: I1007 17:27:51.332275 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96e4d622-c9ad-4f56-95f7-c9f8be1e78df-kube-api-access-z772t" (OuterVolumeSpecName: "kube-api-access-z772t") pod "96e4d622-c9ad-4f56-95f7-c9f8be1e78df" (UID: "96e4d622-c9ad-4f56-95f7-c9f8be1e78df"). InnerVolumeSpecName "kube-api-access-z772t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:27:51 crc kubenswrapper[4681]: I1007 17:27:51.415546 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96e4d622-c9ad-4f56-95f7-c9f8be1e78df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96e4d622-c9ad-4f56-95f7-c9f8be1e78df" (UID: "96e4d622-c9ad-4f56-95f7-c9f8be1e78df"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:27:51 crc kubenswrapper[4681]: I1007 17:27:51.428089 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z772t\" (UniqueName: \"kubernetes.io/projected/96e4d622-c9ad-4f56-95f7-c9f8be1e78df-kube-api-access-z772t\") on node \"crc\" DevicePath \"\"" Oct 07 17:27:51 crc kubenswrapper[4681]: I1007 17:27:51.428123 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96e4d622-c9ad-4f56-95f7-c9f8be1e78df-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 17:27:51 crc kubenswrapper[4681]: I1007 17:27:51.428133 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96e4d622-c9ad-4f56-95f7-c9f8be1e78df-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 17:27:52 crc kubenswrapper[4681]: I1007 17:27:52.163277 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qzwrf" event={"ID":"96e4d622-c9ad-4f56-95f7-c9f8be1e78df","Type":"ContainerDied","Data":"2db835d1deaf80b5fb9dcca4bdb0ff3f4083bb15f39399e7bdf79d17f1fe44ed"} Oct 07 17:27:52 crc kubenswrapper[4681]: I1007 17:27:52.163350 4681 scope.go:117] "RemoveContainer" containerID="c5e46eb3e8024430b452cdc83c2e2150bb5a34ace25d01d29302b934d485dcc4" Oct 07 17:27:52 crc kubenswrapper[4681]: I1007 17:27:52.163550 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qzwrf" Oct 07 17:27:52 crc kubenswrapper[4681]: I1007 17:27:52.193279 4681 scope.go:117] "RemoveContainer" containerID="d803f665df9e2b9ff3fa667a81ed9778da8cd3ee97bb18c515d86aa27395c15d" Oct 07 17:27:52 crc kubenswrapper[4681]: I1007 17:27:52.221755 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qzwrf"] Oct 07 17:27:52 crc kubenswrapper[4681]: I1007 17:27:52.229691 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qzwrf"] Oct 07 17:27:52 crc kubenswrapper[4681]: I1007 17:27:52.240781 4681 scope.go:117] "RemoveContainer" containerID="e22269b46256d189bc6b83fab3423383221f46e2f2e5d0bf68197854e11996df" Oct 07 17:27:53 crc kubenswrapper[4681]: I1007 17:27:53.040287 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96e4d622-c9ad-4f56-95f7-c9f8be1e78df" path="/var/lib/kubelet/pods/96e4d622-c9ad-4f56-95f7-c9f8be1e78df/volumes" Oct 07 17:27:53 crc kubenswrapper[4681]: I1007 17:27:53.965640 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2vsqz"] Oct 07 17:27:53 crc kubenswrapper[4681]: E1007 17:27:53.966563 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96e4d622-c9ad-4f56-95f7-c9f8be1e78df" containerName="extract-utilities" Oct 07 17:27:53 crc kubenswrapper[4681]: I1007 17:27:53.966582 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="96e4d622-c9ad-4f56-95f7-c9f8be1e78df" containerName="extract-utilities" Oct 07 17:27:53 crc kubenswrapper[4681]: E1007 17:27:53.966602 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96e4d622-c9ad-4f56-95f7-c9f8be1e78df" containerName="extract-content" Oct 07 17:27:53 crc kubenswrapper[4681]: I1007 17:27:53.966610 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="96e4d622-c9ad-4f56-95f7-c9f8be1e78df" containerName="extract-content" Oct 07 17:27:53 crc kubenswrapper[4681]: E1007 17:27:53.966655 4681 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96e4d622-c9ad-4f56-95f7-c9f8be1e78df" containerName="registry-server" Oct 07 17:27:53 crc kubenswrapper[4681]: I1007 17:27:53.966664 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="96e4d622-c9ad-4f56-95f7-c9f8be1e78df" containerName="registry-server" Oct 07 17:27:53 crc kubenswrapper[4681]: I1007 17:27:53.966987 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="96e4d622-c9ad-4f56-95f7-c9f8be1e78df" containerName="registry-server" Oct 07 17:27:53 crc kubenswrapper[4681]: I1007 17:27:53.968749 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2vsqz" Oct 07 17:27:53 crc kubenswrapper[4681]: I1007 17:27:53.976661 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8s4s\" (UniqueName: \"kubernetes.io/projected/9cc90449-f49f-4406-8af2-882d7e19b3f4-kube-api-access-v8s4s\") pod \"redhat-operators-2vsqz\" (UID: \"9cc90449-f49f-4406-8af2-882d7e19b3f4\") " pod="openshift-marketplace/redhat-operators-2vsqz" Oct 07 17:27:53 crc kubenswrapper[4681]: I1007 17:27:53.976715 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc90449-f49f-4406-8af2-882d7e19b3f4-catalog-content\") pod \"redhat-operators-2vsqz\" (UID: \"9cc90449-f49f-4406-8af2-882d7e19b3f4\") " pod="openshift-marketplace/redhat-operators-2vsqz" Oct 07 17:27:53 crc kubenswrapper[4681]: I1007 17:27:53.976798 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc90449-f49f-4406-8af2-882d7e19b3f4-utilities\") pod \"redhat-operators-2vsqz\" (UID: \"9cc90449-f49f-4406-8af2-882d7e19b3f4\") " pod="openshift-marketplace/redhat-operators-2vsqz" Oct 07 17:27:53 crc kubenswrapper[4681]: I1007 17:27:53.988487 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2vsqz"] Oct 07 17:27:54 crc kubenswrapper[4681]: I1007 17:27:54.078108 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc90449-f49f-4406-8af2-882d7e19b3f4-catalog-content\") pod \"redhat-operators-2vsqz\" (UID: \"9cc90449-f49f-4406-8af2-882d7e19b3f4\") " pod="openshift-marketplace/redhat-operators-2vsqz" Oct 07 17:27:54 crc kubenswrapper[4681]: I1007 17:27:54.078203 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc90449-f49f-4406-8af2-882d7e19b3f4-utilities\") pod \"redhat-operators-2vsqz\" (UID: \"9cc90449-f49f-4406-8af2-882d7e19b3f4\") " pod="openshift-marketplace/redhat-operators-2vsqz" Oct 07 17:27:54 crc kubenswrapper[4681]: I1007 17:27:54.078326 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8s4s\" (UniqueName: \"kubernetes.io/projected/9cc90449-f49f-4406-8af2-882d7e19b3f4-kube-api-access-v8s4s\") pod \"redhat-operators-2vsqz\" (UID: \"9cc90449-f49f-4406-8af2-882d7e19b3f4\") " pod="openshift-marketplace/redhat-operators-2vsqz" Oct 07 17:27:54 crc kubenswrapper[4681]: I1007 17:27:54.079083 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc90449-f49f-4406-8af2-882d7e19b3f4-utilities\") pod 
\"redhat-operators-2vsqz\" (UID: \"9cc90449-f49f-4406-8af2-882d7e19b3f4\") " pod="openshift-marketplace/redhat-operators-2vsqz" Oct 07 17:27:54 crc kubenswrapper[4681]: I1007 17:27:54.079570 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc90449-f49f-4406-8af2-882d7e19b3f4-catalog-content\") pod \"redhat-operators-2vsqz\" (UID: \"9cc90449-f49f-4406-8af2-882d7e19b3f4\") " pod="openshift-marketplace/redhat-operators-2vsqz" Oct 07 17:27:54 crc kubenswrapper[4681]: I1007 17:27:54.099072 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8s4s\" (UniqueName: \"kubernetes.io/projected/9cc90449-f49f-4406-8af2-882d7e19b3f4-kube-api-access-v8s4s\") pod \"redhat-operators-2vsqz\" (UID: \"9cc90449-f49f-4406-8af2-882d7e19b3f4\") " pod="openshift-marketplace/redhat-operators-2vsqz" Oct 07 17:27:54 crc kubenswrapper[4681]: I1007 17:27:54.293657 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2vsqz" Oct 07 17:27:54 crc kubenswrapper[4681]: I1007 17:27:54.780949 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2vsqz"] Oct 07 17:27:54 crc kubenswrapper[4681]: I1007 17:27:54.974478 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rqdtk" Oct 07 17:27:55 crc kubenswrapper[4681]: I1007 17:27:55.040569 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rqdtk" Oct 07 17:27:55 crc kubenswrapper[4681]: I1007 17:27:55.192382 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2vsqz" event={"ID":"9cc90449-f49f-4406-8af2-882d7e19b3f4","Type":"ContainerStarted","Data":"2a20dcbad969123bfaf85354faef8a156c38b71d18b923b7a00758547d614a35"} Oct 07 17:27:56 crc kubenswrapper[4681]: I1007 17:27:56.202990 4681 generic.go:334] "Generic (PLEG): container finished" podID="9cc90449-f49f-4406-8af2-882d7e19b3f4" containerID="8b827c1fab2f05d7d38335d29438b6fcfdc6977ab41e88e955adae38983f7a5a" exitCode=0 Oct 07 17:27:56 crc kubenswrapper[4681]: I1007 17:27:56.203030 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2vsqz" event={"ID":"9cc90449-f49f-4406-8af2-882d7e19b3f4","Type":"ContainerDied","Data":"8b827c1fab2f05d7d38335d29438b6fcfdc6977ab41e88e955adae38983f7a5a"} Oct 07 17:27:58 crc kubenswrapper[4681]: I1007 17:27:58.337476 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rqdtk"] Oct 07 17:27:58 crc kubenswrapper[4681]: I1007 17:27:58.338052 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rqdtk" podUID="1edd6c28-9cf6-4669-acff-8c53930a4342" containerName="registry-server" containerID="cri-o://4c2fcd3e18dc5c012666d34f8e0d0817b9dd9d684107f835925e0d935fbc5925" gracePeriod=2 Oct 07 17:27:58 crc kubenswrapper[4681]: I1007 17:27:58.858075 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rqdtk" Oct 07 17:27:58 crc kubenswrapper[4681]: I1007 17:27:58.973531 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1edd6c28-9cf6-4669-acff-8c53930a4342-utilities\") pod \"1edd6c28-9cf6-4669-acff-8c53930a4342\" (UID: \"1edd6c28-9cf6-4669-acff-8c53930a4342\") " Oct 07 17:27:58 crc kubenswrapper[4681]: I1007 17:27:58.973640 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1edd6c28-9cf6-4669-acff-8c53930a4342-catalog-content\") pod \"1edd6c28-9cf6-4669-acff-8c53930a4342\" (UID: \"1edd6c28-9cf6-4669-acff-8c53930a4342\") " Oct 07 17:27:58 crc kubenswrapper[4681]: I1007 17:27:58.973679 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmvmj\" (UniqueName: \"kubernetes.io/projected/1edd6c28-9cf6-4669-acff-8c53930a4342-kube-api-access-rmvmj\") pod \"1edd6c28-9cf6-4669-acff-8c53930a4342\" (UID: \"1edd6c28-9cf6-4669-acff-8c53930a4342\") " Oct 07 17:27:58 crc kubenswrapper[4681]: I1007 17:27:58.974373 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1edd6c28-9cf6-4669-acff-8c53930a4342-utilities" (OuterVolumeSpecName: "utilities") pod "1edd6c28-9cf6-4669-acff-8c53930a4342" (UID: "1edd6c28-9cf6-4669-acff-8c53930a4342"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:27:59 crc kubenswrapper[4681]: I1007 17:27:59.025229 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1edd6c28-9cf6-4669-acff-8c53930a4342-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1edd6c28-9cf6-4669-acff-8c53930a4342" (UID: "1edd6c28-9cf6-4669-acff-8c53930a4342"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:27:59 crc kubenswrapper[4681]: I1007 17:27:59.080301 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1edd6c28-9cf6-4669-acff-8c53930a4342-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 17:27:59 crc kubenswrapper[4681]: I1007 17:27:59.081103 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1edd6c28-9cf6-4669-acff-8c53930a4342-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 17:27:59 crc kubenswrapper[4681]: I1007 17:27:59.083469 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1edd6c28-9cf6-4669-acff-8c53930a4342-kube-api-access-rmvmj" (OuterVolumeSpecName: "kube-api-access-rmvmj") pod "1edd6c28-9cf6-4669-acff-8c53930a4342" (UID: "1edd6c28-9cf6-4669-acff-8c53930a4342"). InnerVolumeSpecName "kube-api-access-rmvmj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:27:59 crc kubenswrapper[4681]: I1007 17:27:59.184823 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmvmj\" (UniqueName: \"kubernetes.io/projected/1edd6c28-9cf6-4669-acff-8c53930a4342-kube-api-access-rmvmj\") on node \"crc\" DevicePath \"\"" Oct 07 17:27:59 crc kubenswrapper[4681]: I1007 17:27:59.232775 4681 generic.go:334] "Generic (PLEG): container finished" podID="1edd6c28-9cf6-4669-acff-8c53930a4342" containerID="4c2fcd3e18dc5c012666d34f8e0d0817b9dd9d684107f835925e0d935fbc5925" exitCode=0 Oct 07 17:27:59 crc kubenswrapper[4681]: I1007 17:27:59.232818 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqdtk" event={"ID":"1edd6c28-9cf6-4669-acff-8c53930a4342","Type":"ContainerDied","Data":"4c2fcd3e18dc5c012666d34f8e0d0817b9dd9d684107f835925e0d935fbc5925"} Oct 07 17:27:59 crc kubenswrapper[4681]: I1007 17:27:59.232846 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqdtk" event={"ID":"1edd6c28-9cf6-4669-acff-8c53930a4342","Type":"ContainerDied","Data":"b14271d9d393166e42546783692361830c57b76057855b850926bef101dba14e"} Oct 07 17:27:59 crc kubenswrapper[4681]: I1007 17:27:59.232864 4681 scope.go:117] "RemoveContainer" containerID="4c2fcd3e18dc5c012666d34f8e0d0817b9dd9d684107f835925e0d935fbc5925" Oct 07 17:27:59 crc kubenswrapper[4681]: I1007 17:27:59.232859 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rqdtk" Oct 07 17:27:59 crc kubenswrapper[4681]: I1007 17:27:59.253910 4681 scope.go:117] "RemoveContainer" containerID="4e7727cc78c87af4074a8b7907f95f86e6d843f0cb869b915ae7771ab259b354" Oct 07 17:27:59 crc kubenswrapper[4681]: I1007 17:27:59.282742 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rqdtk"] Oct 07 17:27:59 crc kubenswrapper[4681]: I1007 17:27:59.294286 4681 scope.go:117] "RemoveContainer" containerID="2043898f68be55672495fa8a12c395f0daf5ac4fe4c8c2633e193848be5bb08c" Oct 07 17:27:59 crc kubenswrapper[4681]: I1007 17:27:59.300666 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rqdtk"] Oct 07 17:27:59 crc kubenswrapper[4681]: I1007 17:27:59.329345 4681 scope.go:117] "RemoveContainer" containerID="4c2fcd3e18dc5c012666d34f8e0d0817b9dd9d684107f835925e0d935fbc5925" Oct 07 17:27:59 crc kubenswrapper[4681]: E1007 17:27:59.330421 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c2fcd3e18dc5c012666d34f8e0d0817b9dd9d684107f835925e0d935fbc5925\": container with ID starting with 4c2fcd3e18dc5c012666d34f8e0d0817b9dd9d684107f835925e0d935fbc5925 not found: ID does not exist" containerID="4c2fcd3e18dc5c012666d34f8e0d0817b9dd9d684107f835925e0d935fbc5925" Oct 07 17:27:59 crc kubenswrapper[4681]: I1007 17:27:59.330458 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c2fcd3e18dc5c012666d34f8e0d0817b9dd9d684107f835925e0d935fbc5925"} err="failed to get container status \"4c2fcd3e18dc5c012666d34f8e0d0817b9dd9d684107f835925e0d935fbc5925\": rpc error: code = NotFound desc = could not find container \"4c2fcd3e18dc5c012666d34f8e0d0817b9dd9d684107f835925e0d935fbc5925\": container with ID starting with 4c2fcd3e18dc5c012666d34f8e0d0817b9dd9d684107f835925e0d935fbc5925 not found: ID does not exist" 
Oct 07 17:27:59 crc kubenswrapper[4681]: I1007 17:27:59.330484 4681 scope.go:117] "RemoveContainer" containerID="4e7727cc78c87af4074a8b7907f95f86e6d843f0cb869b915ae7771ab259b354" Oct 07 17:27:59 crc kubenswrapper[4681]: E1007 17:27:59.330955 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e7727cc78c87af4074a8b7907f95f86e6d843f0cb869b915ae7771ab259b354\": container with ID starting with 4e7727cc78c87af4074a8b7907f95f86e6d843f0cb869b915ae7771ab259b354 not found: ID does not exist" containerID="4e7727cc78c87af4074a8b7907f95f86e6d843f0cb869b915ae7771ab259b354" Oct 07 17:27:59 crc kubenswrapper[4681]: I1007 17:27:59.330978 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e7727cc78c87af4074a8b7907f95f86e6d843f0cb869b915ae7771ab259b354"} err="failed to get container status \"4e7727cc78c87af4074a8b7907f95f86e6d843f0cb869b915ae7771ab259b354\": rpc error: code = NotFound desc = could not find container \"4e7727cc78c87af4074a8b7907f95f86e6d843f0cb869b915ae7771ab259b354\": container with ID starting with 4e7727cc78c87af4074a8b7907f95f86e6d843f0cb869b915ae7771ab259b354 not found: ID does not exist" Oct 07 17:27:59 crc kubenswrapper[4681]: I1007 17:27:59.330995 4681 scope.go:117] "RemoveContainer" containerID="2043898f68be55672495fa8a12c395f0daf5ac4fe4c8c2633e193848be5bb08c" Oct 07 17:27:59 crc kubenswrapper[4681]: E1007 17:27:59.331490 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2043898f68be55672495fa8a12c395f0daf5ac4fe4c8c2633e193848be5bb08c\": container with ID starting with 2043898f68be55672495fa8a12c395f0daf5ac4fe4c8c2633e193848be5bb08c not found: ID does not exist" containerID="2043898f68be55672495fa8a12c395f0daf5ac4fe4c8c2633e193848be5bb08c" Oct 07 17:27:59 crc kubenswrapper[4681]: I1007 17:27:59.331517 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2043898f68be55672495fa8a12c395f0daf5ac4fe4c8c2633e193848be5bb08c"} err="failed to get container status \"2043898f68be55672495fa8a12c395f0daf5ac4fe4c8c2633e193848be5bb08c\": rpc error: code = NotFound desc = could not find container \"2043898f68be55672495fa8a12c395f0daf5ac4fe4c8c2633e193848be5bb08c\": container with ID starting with 2043898f68be55672495fa8a12c395f0daf5ac4fe4c8c2633e193848be5bb08c not found: ID does not exist" Oct 07 17:28:01 crc kubenswrapper[4681]: I1007 17:28:01.039688 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1edd6c28-9cf6-4669-acff-8c53930a4342" path="/var/lib/kubelet/pods/1edd6c28-9cf6-4669-acff-8c53930a4342/volumes" Oct 07 17:28:03 crc kubenswrapper[4681]: I1007 17:28:03.268482 4681 generic.go:334] "Generic (PLEG): container finished" podID="a1d74e17-5142-40f0-9847-0f9ee5e33f90" containerID="dd39062eb6c3718ca32455c2951b10450478e88e3c1abb0fb9c47c284678adc5" exitCode=0 Oct 07 17:28:03 crc kubenswrapper[4681]: I1007 17:28:03.268550 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj" event={"ID":"a1d74e17-5142-40f0-9847-0f9ee5e33f90","Type":"ContainerDied","Data":"dd39062eb6c3718ca32455c2951b10450478e88e3c1abb0fb9c47c284678adc5"} Oct 07 17:28:06 crc kubenswrapper[4681]: I1007 17:28:06.039553 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj" Oct 07 17:28:06 crc kubenswrapper[4681]: I1007 17:28:06.218368 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1d74e17-5142-40f0-9847-0f9ee5e33f90-repo-setup-combined-ca-bundle\") pod \"a1d74e17-5142-40f0-9847-0f9ee5e33f90\" (UID: \"a1d74e17-5142-40f0-9847-0f9ee5e33f90\") " Oct 07 17:28:06 crc kubenswrapper[4681]: I1007 17:28:06.218437 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngpsm\" (UniqueName: \"kubernetes.io/projected/a1d74e17-5142-40f0-9847-0f9ee5e33f90-kube-api-access-ngpsm\") pod \"a1d74e17-5142-40f0-9847-0f9ee5e33f90\" (UID: \"a1d74e17-5142-40f0-9847-0f9ee5e33f90\") " Oct 07 17:28:06 crc kubenswrapper[4681]: I1007 17:28:06.218524 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a1d74e17-5142-40f0-9847-0f9ee5e33f90-ssh-key\") pod \"a1d74e17-5142-40f0-9847-0f9ee5e33f90\" (UID: \"a1d74e17-5142-40f0-9847-0f9ee5e33f90\") " Oct 07 17:28:06 crc kubenswrapper[4681]: I1007 17:28:06.218643 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1d74e17-5142-40f0-9847-0f9ee5e33f90-inventory\") pod \"a1d74e17-5142-40f0-9847-0f9ee5e33f90\" (UID: \"a1d74e17-5142-40f0-9847-0f9ee5e33f90\") " Oct 07 17:28:06 crc kubenswrapper[4681]: I1007 17:28:06.224779 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1d74e17-5142-40f0-9847-0f9ee5e33f90-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "a1d74e17-5142-40f0-9847-0f9ee5e33f90" (UID: "a1d74e17-5142-40f0-9847-0f9ee5e33f90"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:28:06 crc kubenswrapper[4681]: I1007 17:28:06.234193 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1d74e17-5142-40f0-9847-0f9ee5e33f90-kube-api-access-ngpsm" (OuterVolumeSpecName: "kube-api-access-ngpsm") pod "a1d74e17-5142-40f0-9847-0f9ee5e33f90" (UID: "a1d74e17-5142-40f0-9847-0f9ee5e33f90"). InnerVolumeSpecName "kube-api-access-ngpsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:28:06 crc kubenswrapper[4681]: I1007 17:28:06.249053 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1d74e17-5142-40f0-9847-0f9ee5e33f90-inventory" (OuterVolumeSpecName: "inventory") pod "a1d74e17-5142-40f0-9847-0f9ee5e33f90" (UID: "a1d74e17-5142-40f0-9847-0f9ee5e33f90"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:28:06 crc kubenswrapper[4681]: I1007 17:28:06.266469 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1d74e17-5142-40f0-9847-0f9ee5e33f90-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a1d74e17-5142-40f0-9847-0f9ee5e33f90" (UID: "a1d74e17-5142-40f0-9847-0f9ee5e33f90"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:28:06 crc kubenswrapper[4681]: I1007 17:28:06.301721 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj" event={"ID":"a1d74e17-5142-40f0-9847-0f9ee5e33f90","Type":"ContainerDied","Data":"f2e81e28fd511d21744d3e3817a1c3edb4db3c6a99268657e6a1bbdbb7786c58"} Oct 07 17:28:06 crc kubenswrapper[4681]: I1007 17:28:06.301796 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj" Oct 07 17:28:06 crc kubenswrapper[4681]: I1007 17:28:06.301810 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2e81e28fd511d21744d3e3817a1c3edb4db3c6a99268657e6a1bbdbb7786c58" Oct 07 17:28:06 crc kubenswrapper[4681]: I1007 17:28:06.321813 4681 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1d74e17-5142-40f0-9847-0f9ee5e33f90-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:28:06 crc kubenswrapper[4681]: I1007 17:28:06.321996 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngpsm\" (UniqueName: \"kubernetes.io/projected/a1d74e17-5142-40f0-9847-0f9ee5e33f90-kube-api-access-ngpsm\") on node \"crc\" DevicePath \"\"" Oct 07 17:28:06 crc kubenswrapper[4681]: I1007 17:28:06.322079 4681 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a1d74e17-5142-40f0-9847-0f9ee5e33f90-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 17:28:06 crc kubenswrapper[4681]: I1007 17:28:06.322174 4681 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1d74e17-5142-40f0-9847-0f9ee5e33f90-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 17:28:06 crc kubenswrapper[4681]: E1007 17:28:06.440701 4681 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1d74e17_5142_40f0_9847_0f9ee5e33f90.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1d74e17_5142_40f0_9847_0f9ee5e33f90.slice/crio-f2e81e28fd511d21744d3e3817a1c3edb4db3c6a99268657e6a1bbdbb7786c58\": RecentStats: unable to find data in memory cache]" Oct 07 17:28:07 crc kubenswrapper[4681]: I1007 17:28:07.165516 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-hmgls"] Oct 07 17:28:07 crc kubenswrapper[4681]: E1007 17:28:07.166016 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1edd6c28-9cf6-4669-acff-8c53930a4342" containerName="extract-utilities" Oct 07 17:28:07 crc kubenswrapper[4681]: I1007 17:28:07.166033 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="1edd6c28-9cf6-4669-acff-8c53930a4342" containerName="extract-utilities" Oct 07 17:28:07 crc kubenswrapper[4681]: E1007 17:28:07.166072 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1edd6c28-9cf6-4669-acff-8c53930a4342" containerName="extract-content" Oct 07 17:28:07 crc kubenswrapper[4681]: I1007 17:28:07.166081 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="1edd6c28-9cf6-4669-acff-8c53930a4342" containerName="extract-content" Oct 07 17:28:07 crc kubenswrapper[4681]: E1007 17:28:07.166099 4681 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="1edd6c28-9cf6-4669-acff-8c53930a4342" containerName="registry-server" Oct 07 17:28:07 crc kubenswrapper[4681]: I1007 17:28:07.166107 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="1edd6c28-9cf6-4669-acff-8c53930a4342" containerName="registry-server" Oct 07 17:28:07 crc kubenswrapper[4681]: E1007 17:28:07.166118 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1d74e17-5142-40f0-9847-0f9ee5e33f90" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 07 17:28:07 crc kubenswrapper[4681]: I1007 17:28:07.166127 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1d74e17-5142-40f0-9847-0f9ee5e33f90" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 07 17:28:07 crc kubenswrapper[4681]: I1007 17:28:07.166357 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1d74e17-5142-40f0-9847-0f9ee5e33f90" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 07 17:28:07 crc kubenswrapper[4681]: I1007 17:28:07.166377 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="1edd6c28-9cf6-4669-acff-8c53930a4342" containerName="registry-server" Oct 07 17:28:07 crc kubenswrapper[4681]: I1007 17:28:07.167085 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hmgls" Oct 07 17:28:07 crc kubenswrapper[4681]: I1007 17:28:07.169697 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 17:28:07 crc kubenswrapper[4681]: I1007 17:28:07.171829 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 17:28:07 crc kubenswrapper[4681]: I1007 17:28:07.173365 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 17:28:07 crc kubenswrapper[4681]: I1007 17:28:07.180442 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-hmgls"] Oct 07 17:28:07 crc kubenswrapper[4681]: I1007 17:28:07.182272 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vtl6" Oct 07 17:28:07 crc kubenswrapper[4681]: I1007 17:28:07.337118 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0cdf5aef-8c9a-4cd5-8f38-2f368fe245df-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hmgls\" (UID: \"0cdf5aef-8c9a-4cd5-8f38-2f368fe245df\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hmgls" Oct 07 17:28:07 crc kubenswrapper[4681]: I1007 17:28:07.337236 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0cdf5aef-8c9a-4cd5-8f38-2f368fe245df-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hmgls\" (UID: \"0cdf5aef-8c9a-4cd5-8f38-2f368fe245df\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hmgls" Oct 07 17:28:07 crc kubenswrapper[4681]: I1007 17:28:07.337404 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r8hh\" (UniqueName: \"kubernetes.io/projected/0cdf5aef-8c9a-4cd5-8f38-2f368fe245df-kube-api-access-2r8hh\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hmgls\" (UID: 
\"0cdf5aef-8c9a-4cd5-8f38-2f368fe245df\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hmgls" Oct 07 17:28:07 crc kubenswrapper[4681]: I1007 17:28:07.438955 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r8hh\" (UniqueName: \"kubernetes.io/projected/0cdf5aef-8c9a-4cd5-8f38-2f368fe245df-kube-api-access-2r8hh\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hmgls\" (UID: \"0cdf5aef-8c9a-4cd5-8f38-2f368fe245df\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hmgls" Oct 07 17:28:07 crc kubenswrapper[4681]: I1007 17:28:07.439253 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0cdf5aef-8c9a-4cd5-8f38-2f368fe245df-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hmgls\" (UID: \"0cdf5aef-8c9a-4cd5-8f38-2f368fe245df\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hmgls" Oct 07 17:28:07 crc kubenswrapper[4681]: I1007 17:28:07.439329 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0cdf5aef-8c9a-4cd5-8f38-2f368fe245df-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hmgls\" (UID: \"0cdf5aef-8c9a-4cd5-8f38-2f368fe245df\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hmgls" Oct 07 17:28:07 crc kubenswrapper[4681]: I1007 17:28:07.449815 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0cdf5aef-8c9a-4cd5-8f38-2f368fe245df-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hmgls\" (UID: \"0cdf5aef-8c9a-4cd5-8f38-2f368fe245df\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hmgls" Oct 07 17:28:07 crc kubenswrapper[4681]: I1007 17:28:07.450511 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0cdf5aef-8c9a-4cd5-8f38-2f368fe245df-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hmgls\" (UID: \"0cdf5aef-8c9a-4cd5-8f38-2f368fe245df\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hmgls" Oct 07 17:28:07 crc kubenswrapper[4681]: I1007 17:28:07.457540 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r8hh\" (UniqueName: \"kubernetes.io/projected/0cdf5aef-8c9a-4cd5-8f38-2f368fe245df-kube-api-access-2r8hh\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hmgls\" (UID: \"0cdf5aef-8c9a-4cd5-8f38-2f368fe245df\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hmgls" Oct 07 17:28:07 crc kubenswrapper[4681]: I1007 17:28:07.483569 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hmgls" Oct 07 17:28:08 crc kubenswrapper[4681]: I1007 17:28:08.137176 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-hmgls"] Oct 07 17:28:08 crc kubenswrapper[4681]: I1007 17:28:08.319970 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hmgls" event={"ID":"0cdf5aef-8c9a-4cd5-8f38-2f368fe245df","Type":"ContainerStarted","Data":"181c39e0e25531d963ae4cad1387ec8f24fc685c0f5256b919829ffed98b1e31"} Oct 07 17:28:12 crc kubenswrapper[4681]: I1007 17:28:12.195531 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 17:28:12 crc kubenswrapper[4681]: I1007 17:28:12.196046 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 17:28:17 crc kubenswrapper[4681]: I1007 17:28:17.002993 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 17:28:18 crc kubenswrapper[4681]: I1007 17:28:18.417162 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hmgls" event={"ID":"0cdf5aef-8c9a-4cd5-8f38-2f368fe245df","Type":"ContainerStarted","Data":"f2b6be655610731d951f931a39ab923f928843fb98e48faf154362e45b939ac0"} Oct 07 17:28:18 crc kubenswrapper[4681]: I1007 17:28:18.421249 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2vsqz" event={"ID":"9cc90449-f49f-4406-8af2-882d7e19b3f4","Type":"ContainerStarted","Data":"aca222554cc40e0d96a3cde8d59a952146890e61f61f960cdffcf0b57aa7af1f"} Oct 07 17:28:18 crc kubenswrapper[4681]: I1007 17:28:18.431797 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hmgls" podStartSLOduration=2.587059721 podStartE2EDuration="11.431783113s" podCreationTimestamp="2025-10-07 17:28:07 +0000 UTC" firstStartedPulling="2025-10-07 17:28:08.156015392 +0000 UTC m=+1491.803426957" lastFinishedPulling="2025-10-07 17:28:17.000738784 +0000 UTC m=+1500.648150349" observedRunningTime="2025-10-07 17:28:18.430043865 +0000 UTC m=+1502.077455420" watchObservedRunningTime="2025-10-07 17:28:18.431783113 +0000 UTC m=+1502.079194668" Oct 07 17:28:26 crc kubenswrapper[4681]: I1007 17:28:26.491754 4681 generic.go:334] "Generic (PLEG): container finished" podID="9cc90449-f49f-4406-8af2-882d7e19b3f4" containerID="aca222554cc40e0d96a3cde8d59a952146890e61f61f960cdffcf0b57aa7af1f" exitCode=0 Oct 07 17:28:26 crc kubenswrapper[4681]: I1007 17:28:26.491837 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2vsqz" event={"ID":"9cc90449-f49f-4406-8af2-882d7e19b3f4","Type":"ContainerDied","Data":"aca222554cc40e0d96a3cde8d59a952146890e61f61f960cdffcf0b57aa7af1f"} Oct 07 17:28:27 crc kubenswrapper[4681]: I1007 17:28:27.504726 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-2vsqz" event={"ID":"9cc90449-f49f-4406-8af2-882d7e19b3f4","Type":"ContainerStarted","Data":"e15106f1b3d6c14c51351612a20abc3b2982903b34ad75bdc0f1c2ed182a987f"} Oct 07 17:28:27 crc kubenswrapper[4681]: I1007 17:28:27.533162 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2vsqz" podStartSLOduration=3.5623247620000003 podStartE2EDuration="34.533143628s" podCreationTimestamp="2025-10-07 17:27:53 +0000 UTC" firstStartedPulling="2025-10-07 17:27:56.204354476 +0000 UTC m=+1479.851766031" lastFinishedPulling="2025-10-07 17:28:27.175173342 +0000 UTC m=+1510.822584897" observedRunningTime="2025-10-07 17:28:27.520430873 +0000 UTC m=+1511.167842428" watchObservedRunningTime="2025-10-07 17:28:27.533143628 +0000 UTC m=+1511.180555183" Oct 07 17:28:28 crc kubenswrapper[4681]: I1007 17:28:28.514482 4681 generic.go:334] "Generic (PLEG): container finished" podID="0cdf5aef-8c9a-4cd5-8f38-2f368fe245df" containerID="f2b6be655610731d951f931a39ab923f928843fb98e48faf154362e45b939ac0" exitCode=0 Oct 07 17:28:28 crc kubenswrapper[4681]: I1007 17:28:28.514523 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hmgls" event={"ID":"0cdf5aef-8c9a-4cd5-8f38-2f368fe245df","Type":"ContainerDied","Data":"f2b6be655610731d951f931a39ab923f928843fb98e48faf154362e45b939ac0"} Oct 07 17:28:29 crc kubenswrapper[4681]: I1007 17:28:29.930050 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hmgls" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.001486 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0cdf5aef-8c9a-4cd5-8f38-2f368fe245df-ssh-key\") pod \"0cdf5aef-8c9a-4cd5-8f38-2f368fe245df\" (UID: \"0cdf5aef-8c9a-4cd5-8f38-2f368fe245df\") " Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.001714 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0cdf5aef-8c9a-4cd5-8f38-2f368fe245df-inventory\") pod \"0cdf5aef-8c9a-4cd5-8f38-2f368fe245df\" (UID: \"0cdf5aef-8c9a-4cd5-8f38-2f368fe245df\") " Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.001917 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r8hh\" (UniqueName: \"kubernetes.io/projected/0cdf5aef-8c9a-4cd5-8f38-2f368fe245df-kube-api-access-2r8hh\") pod \"0cdf5aef-8c9a-4cd5-8f38-2f368fe245df\" (UID: \"0cdf5aef-8c9a-4cd5-8f38-2f368fe245df\") " Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.029282 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cdf5aef-8c9a-4cd5-8f38-2f368fe245df-kube-api-access-2r8hh" (OuterVolumeSpecName: "kube-api-access-2r8hh") pod "0cdf5aef-8c9a-4cd5-8f38-2f368fe245df" (UID: "0cdf5aef-8c9a-4cd5-8f38-2f368fe245df"). InnerVolumeSpecName "kube-api-access-2r8hh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.032127 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cdf5aef-8c9a-4cd5-8f38-2f368fe245df-inventory" (OuterVolumeSpecName: "inventory") pod "0cdf5aef-8c9a-4cd5-8f38-2f368fe245df" (UID: "0cdf5aef-8c9a-4cd5-8f38-2f368fe245df"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.044182 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cdf5aef-8c9a-4cd5-8f38-2f368fe245df-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0cdf5aef-8c9a-4cd5-8f38-2f368fe245df" (UID: "0cdf5aef-8c9a-4cd5-8f38-2f368fe245df"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.104255 4681 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0cdf5aef-8c9a-4cd5-8f38-2f368fe245df-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.104288 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r8hh\" (UniqueName: \"kubernetes.io/projected/0cdf5aef-8c9a-4cd5-8f38-2f368fe245df-kube-api-access-2r8hh\") on node \"crc\" DevicePath \"\"" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.104300 4681 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0cdf5aef-8c9a-4cd5-8f38-2f368fe245df-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.557361 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hmgls" event={"ID":"0cdf5aef-8c9a-4cd5-8f38-2f368fe245df","Type":"ContainerDied","Data":"181c39e0e25531d963ae4cad1387ec8f24fc685c0f5256b919829ffed98b1e31"} Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.557405 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="181c39e0e25531d963ae4cad1387ec8f24fc685c0f5256b919829ffed98b1e31" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.557462 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hmgls" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.607446 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l"] Oct 07 17:28:30 crc kubenswrapper[4681]: E1007 17:28:30.607818 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cdf5aef-8c9a-4cd5-8f38-2f368fe245df" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.607837 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cdf5aef-8c9a-4cd5-8f38-2f368fe245df" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.608057 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cdf5aef-8c9a-4cd5-8f38-2f368fe245df" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.608718 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.615209 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l"] Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.616577 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.616695 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.616720 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vtl6" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.616949 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.717715 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5da1ef34-103f-4687-8454-89abe7b61f54-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l\" (UID: \"5da1ef34-103f-4687-8454-89abe7b61f54\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.718049 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd7dt\" (UniqueName: \"kubernetes.io/projected/5da1ef34-103f-4687-8454-89abe7b61f54-kube-api-access-fd7dt\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l\" (UID: \"5da1ef34-103f-4687-8454-89abe7b61f54\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.718177 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5da1ef34-103f-4687-8454-89abe7b61f54-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l\" (UID: \"5da1ef34-103f-4687-8454-89abe7b61f54\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.718298 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5da1ef34-103f-4687-8454-89abe7b61f54-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l\" (UID: \"5da1ef34-103f-4687-8454-89abe7b61f54\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.819791 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5da1ef34-103f-4687-8454-89abe7b61f54-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l\" (UID: \"5da1ef34-103f-4687-8454-89abe7b61f54\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.820567 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd7dt\" (UniqueName: \"kubernetes.io/projected/5da1ef34-103f-4687-8454-89abe7b61f54-kube-api-access-fd7dt\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l\" (UID: \"5da1ef34-103f-4687-8454-89abe7b61f54\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.820625 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5da1ef34-103f-4687-8454-89abe7b61f54-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l\" (UID: \"5da1ef34-103f-4687-8454-89abe7b61f54\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.820670 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5da1ef34-103f-4687-8454-89abe7b61f54-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l\" (UID: \"5da1ef34-103f-4687-8454-89abe7b61f54\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.823713 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5da1ef34-103f-4687-8454-89abe7b61f54-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l\" (UID: \"5da1ef34-103f-4687-8454-89abe7b61f54\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.824208 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5da1ef34-103f-4687-8454-89abe7b61f54-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l\" (UID: \"5da1ef34-103f-4687-8454-89abe7b61f54\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.825486 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5da1ef34-103f-4687-8454-89abe7b61f54-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l\" (UID: \"5da1ef34-103f-4687-8454-89abe7b61f54\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.846422 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd7dt\" (UniqueName: \"kubernetes.io/projected/5da1ef34-103f-4687-8454-89abe7b61f54-kube-api-access-fd7dt\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l\" (UID: \"5da1ef34-103f-4687-8454-89abe7b61f54\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l" Oct 07 17:28:30 crc kubenswrapper[4681]: I1007 17:28:30.943855 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l" Oct 07 17:28:31 crc kubenswrapper[4681]: W1007 17:28:31.613493 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5da1ef34_103f_4687_8454_89abe7b61f54.slice/crio-9fc5e3dc7863ca49aee6022bdee2b120361c94ea5bfb07d19952aed2ba5de031 WatchSource:0}: Error finding container 9fc5e3dc7863ca49aee6022bdee2b120361c94ea5bfb07d19952aed2ba5de031: Status 404 returned error can't find the container with id 9fc5e3dc7863ca49aee6022bdee2b120361c94ea5bfb07d19952aed2ba5de031 Oct 07 17:28:31 crc kubenswrapper[4681]: I1007 17:28:31.619548 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l"] Oct 07 17:28:32 crc kubenswrapper[4681]: I1007 17:28:32.575146 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l" event={"ID":"5da1ef34-103f-4687-8454-89abe7b61f54","Type":"ContainerStarted","Data":"8c26b3edbe3399bdfc105da74d5615c7a90869b2117a49b20d0075a9775e1a6d"} Oct 07 17:28:32 crc kubenswrapper[4681]: I1007 17:28:32.575420 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l" event={"ID":"5da1ef34-103f-4687-8454-89abe7b61f54","Type":"ContainerStarted","Data":"9fc5e3dc7863ca49aee6022bdee2b120361c94ea5bfb07d19952aed2ba5de031"} Oct 07 17:28:32 crc kubenswrapper[4681]: I1007 17:28:32.594616 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l" podStartSLOduration=2.415006701 podStartE2EDuration="2.594601972s" podCreationTimestamp="2025-10-07 17:28:30 +0000 UTC" firstStartedPulling="2025-10-07 17:28:31.615422082 +0000 UTC m=+1515.262833637" lastFinishedPulling="2025-10-07 17:28:31.795017343 +0000 UTC m=+1515.442428908" observedRunningTime="2025-10-07 17:28:32.590499188 +0000 UTC m=+1516.237910743" watchObservedRunningTime="2025-10-07 17:28:32.594601972 +0000 UTC m=+1516.242013527" Oct 07 17:28:33 crc kubenswrapper[4681]: I1007 17:28:33.887862 4681 scope.go:117] "RemoveContainer" containerID="1b2775109d58b27cc1e8dba7d1e0d90a73552d7d8cb8527b108f90e4330d5a45" Oct 07 17:28:33 crc kubenswrapper[4681]: I1007 17:28:33.912153 4681 scope.go:117] "RemoveContainer" containerID="05d6c467f6709cb3b68fb25926f26eead9e6386c7b661881059cd415af9216c2" Oct 07 17:28:33 crc kubenswrapper[4681]: I1007 17:28:33.934737 4681 scope.go:117] "RemoveContainer" containerID="589a400fada1fcb5d074a5adabfeb4644cd9cca72461253fb4b784214c38a6fd" Oct 07 17:28:33 crc kubenswrapper[4681]: I1007 17:28:33.990228 4681 scope.go:117] "RemoveContainer" containerID="fa32f5f9b62796c460b420238dfac9f3d68c3b05e3a1db483e99dcb79df5857f" Oct 07 17:28:34 crc kubenswrapper[4681]: I1007 17:28:34.016482 4681 scope.go:117] "RemoveContainer" containerID="423b4e74661fdafbd104dab0491c59e1721cee46a46b8a99177f3567d0141b41" Oct 07 17:28:34 crc kubenswrapper[4681]: I1007 17:28:34.294394 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2vsqz" Oct 07 17:28:34 crc kubenswrapper[4681]: I1007 17:28:34.294742 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2vsqz" Oct 07 17:28:34 crc kubenswrapper[4681]: I1007 17:28:34.367103 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-2vsqz" Oct 07 17:28:34 crc kubenswrapper[4681]: I1007 17:28:34.640018 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2vsqz" Oct 07 17:28:34 crc kubenswrapper[4681]: I1007 17:28:34.758105 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2vsqz"] Oct 07 17:28:34 crc kubenswrapper[4681]: I1007 17:28:34.827457 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vwnmb"] Oct 07 17:28:34 crc kubenswrapper[4681]: I1007 17:28:34.827812 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vwnmb" podUID="85386158-eea6-47d2-bd74-d43e0058715f" containerName="registry-server" containerID="cri-o://d2b30eb544c2f07955373dc86a1233ab05ac7b2025a12d3a0758be6d10f9d8ea" gracePeriod=2 Oct 07 17:28:37 crc kubenswrapper[4681]: I1007 17:28:37.621358 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vwnmb_85386158-eea6-47d2-bd74-d43e0058715f/registry-server/0.log" Oct 07 17:28:37 crc kubenswrapper[4681]: I1007 17:28:37.625273 4681 generic.go:334] "Generic (PLEG): container finished" podID="85386158-eea6-47d2-bd74-d43e0058715f" containerID="d2b30eb544c2f07955373dc86a1233ab05ac7b2025a12d3a0758be6d10f9d8ea" exitCode=137 Oct 07 17:28:37 crc kubenswrapper[4681]: I1007 17:28:37.625312 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwnmb" event={"ID":"85386158-eea6-47d2-bd74-d43e0058715f","Type":"ContainerDied","Data":"d2b30eb544c2f07955373dc86a1233ab05ac7b2025a12d3a0758be6d10f9d8ea"} Oct 07 17:28:37 crc kubenswrapper[4681]: I1007 17:28:37.625337 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwnmb" event={"ID":"85386158-eea6-47d2-bd74-d43e0058715f","Type":"ContainerDied","Data":"bd6d6b8eb1ef3916bfcc55470e09f3f36259f8e6fce075af42ab516e6b577f69"} Oct 07 17:28:37 crc kubenswrapper[4681]: I1007 17:28:37.625347 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd6d6b8eb1ef3916bfcc55470e09f3f36259f8e6fce075af42ab516e6b577f69" Oct 07 17:28:37 crc kubenswrapper[4681]: I1007 17:28:37.707908 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vwnmb_85386158-eea6-47d2-bd74-d43e0058715f/registry-server/0.log" Oct 07 17:28:37 crc kubenswrapper[4681]: I1007 17:28:37.708544 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vwnmb" Oct 07 17:28:37 crc kubenswrapper[4681]: I1007 17:28:37.842091 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9t4k\" (UniqueName: \"kubernetes.io/projected/85386158-eea6-47d2-bd74-d43e0058715f-kube-api-access-r9t4k\") pod \"85386158-eea6-47d2-bd74-d43e0058715f\" (UID: \"85386158-eea6-47d2-bd74-d43e0058715f\") " Oct 07 17:28:37 crc kubenswrapper[4681]: I1007 17:28:37.842474 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85386158-eea6-47d2-bd74-d43e0058715f-catalog-content\") pod \"85386158-eea6-47d2-bd74-d43e0058715f\" (UID: \"85386158-eea6-47d2-bd74-d43e0058715f\") " Oct 07 17:28:37 crc kubenswrapper[4681]: I1007 17:28:37.842687 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85386158-eea6-47d2-bd74-d43e0058715f-utilities\") pod \"85386158-eea6-47d2-bd74-d43e0058715f\" (UID: \"85386158-eea6-47d2-bd74-d43e0058715f\") " Oct 07 17:28:37 crc kubenswrapper[4681]: I1007 17:28:37.843427 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85386158-eea6-47d2-bd74-d43e0058715f-utilities" (OuterVolumeSpecName: "utilities") pod "85386158-eea6-47d2-bd74-d43e0058715f" (UID: "85386158-eea6-47d2-bd74-d43e0058715f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:28:37 crc kubenswrapper[4681]: I1007 17:28:37.863573 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85386158-eea6-47d2-bd74-d43e0058715f-kube-api-access-r9t4k" (OuterVolumeSpecName: "kube-api-access-r9t4k") pod "85386158-eea6-47d2-bd74-d43e0058715f" (UID: "85386158-eea6-47d2-bd74-d43e0058715f"). InnerVolumeSpecName "kube-api-access-r9t4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:28:37 crc kubenswrapper[4681]: I1007 17:28:37.944626 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85386158-eea6-47d2-bd74-d43e0058715f-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 17:28:37 crc kubenswrapper[4681]: I1007 17:28:37.944664 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9t4k\" (UniqueName: \"kubernetes.io/projected/85386158-eea6-47d2-bd74-d43e0058715f-kube-api-access-r9t4k\") on node \"crc\" DevicePath \"\"" Oct 07 17:28:37 crc kubenswrapper[4681]: I1007 17:28:37.973068 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85386158-eea6-47d2-bd74-d43e0058715f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85386158-eea6-47d2-bd74-d43e0058715f" (UID: "85386158-eea6-47d2-bd74-d43e0058715f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:28:38 crc kubenswrapper[4681]: I1007 17:28:38.046749 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85386158-eea6-47d2-bd74-d43e0058715f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 17:28:38 crc kubenswrapper[4681]: I1007 17:28:38.632026 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vwnmb" Oct 07 17:28:38 crc kubenswrapper[4681]: I1007 17:28:38.665925 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vwnmb"] Oct 07 17:28:38 crc kubenswrapper[4681]: I1007 17:28:38.676222 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vwnmb"] Oct 07 17:28:39 crc kubenswrapper[4681]: I1007 17:28:39.048173 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85386158-eea6-47d2-bd74-d43e0058715f" path="/var/lib/kubelet/pods/85386158-eea6-47d2-bd74-d43e0058715f/volumes" Oct 07 17:28:42 crc kubenswrapper[4681]: I1007 17:28:42.195180 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 17:28:42 crc kubenswrapper[4681]: I1007 17:28:42.195541 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 17:28:42 crc kubenswrapper[4681]: I1007 17:28:42.195591 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" Oct 07 17:28:42 crc kubenswrapper[4681]: I1007 17:28:42.196322 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0"} pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 17:28:42 crc kubenswrapper[4681]: I1007 17:28:42.196383 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" containerID="cri-o://a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" gracePeriod=600 Oct 07 17:28:42 crc kubenswrapper[4681]: E1007 17:28:42.380668 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:28:42 crc kubenswrapper[4681]: I1007 17:28:42.669636 4681 generic.go:334] "Generic (PLEG): container finished" podID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerID="a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" exitCode=0 Oct 07 17:28:42 crc kubenswrapper[4681]: I1007 17:28:42.669685 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerDied","Data":"a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0"} Oct 07 17:28:42 crc 
kubenswrapper[4681]: I1007 17:28:42.669723 4681 scope.go:117] "RemoveContainer" containerID="78c5b31222deba1f8fdd3bf8fee1a2d7ac203687a55423d769012061ba951cb8" Oct 07 17:28:42 crc kubenswrapper[4681]: I1007 17:28:42.670460 4681 scope.go:117] "RemoveContainer" containerID="a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" Oct 07 17:28:42 crc kubenswrapper[4681]: E1007 17:28:42.670773 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:28:55 crc kubenswrapper[4681]: I1007 17:28:55.029756 4681 scope.go:117] "RemoveContainer" containerID="a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" Oct 07 17:28:55 crc kubenswrapper[4681]: E1007 17:28:55.030609 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:29:06 crc kubenswrapper[4681]: I1007 17:29:06.030283 4681 scope.go:117] "RemoveContainer" containerID="a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" Oct 07 17:29:06 crc kubenswrapper[4681]: E1007 17:29:06.031108 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:29:20 crc kubenswrapper[4681]: I1007 17:29:20.030589 4681 scope.go:117] "RemoveContainer" containerID="a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" Oct 07 17:29:20 crc kubenswrapper[4681]: E1007 17:29:20.031392 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:29:34 crc kubenswrapper[4681]: I1007 17:29:34.031233 4681 scope.go:117] "RemoveContainer" containerID="a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" Oct 07 17:29:34 crc kubenswrapper[4681]: E1007 17:29:34.032416 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:29:34 crc 
kubenswrapper[4681]: I1007 17:29:34.172052 4681 scope.go:117] "RemoveContainer" containerID="bc189d7f1f0fea5b840abb631dde86d33ca57a89d080e18db348b76100f174fe" Oct 07 17:29:34 crc kubenswrapper[4681]: I1007 17:29:34.208129 4681 scope.go:117] "RemoveContainer" containerID="d2b30eb544c2f07955373dc86a1233ab05ac7b2025a12d3a0758be6d10f9d8ea" Oct 07 17:29:34 crc kubenswrapper[4681]: I1007 17:29:34.245703 4681 scope.go:117] "RemoveContainer" containerID="131b139777266861bc6ddc7e1ddb639f8b9bd81d0de708cedc933b014baa03b4" Oct 07 17:29:46 crc kubenswrapper[4681]: I1007 17:29:46.029749 4681 scope.go:117] "RemoveContainer" containerID="a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" Oct 07 17:29:46 crc kubenswrapper[4681]: E1007 17:29:46.030477 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:29:57 crc kubenswrapper[4681]: I1007 17:29:57.040473 4681 scope.go:117] "RemoveContainer" containerID="a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" Oct 07 17:29:57 crc kubenswrapper[4681]: E1007 17:29:57.041389 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:30:00 crc kubenswrapper[4681]: I1007 17:30:00.143888 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330970-dk76w"] Oct 07 17:30:00 crc kubenswrapper[4681]: E1007 17:30:00.144754 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85386158-eea6-47d2-bd74-d43e0058715f" containerName="registry-server" Oct 07 17:30:00 crc kubenswrapper[4681]: I1007 17:30:00.144765 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="85386158-eea6-47d2-bd74-d43e0058715f" containerName="registry-server" Oct 07 17:30:00 crc kubenswrapper[4681]: E1007 17:30:00.144791 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85386158-eea6-47d2-bd74-d43e0058715f" containerName="extract-content" Oct 07 17:30:00 crc kubenswrapper[4681]: I1007 17:30:00.144796 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="85386158-eea6-47d2-bd74-d43e0058715f" containerName="extract-content" Oct 07 17:30:00 crc kubenswrapper[4681]: E1007 17:30:00.144819 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85386158-eea6-47d2-bd74-d43e0058715f" containerName="extract-utilities" Oct 07 17:30:00 crc kubenswrapper[4681]: I1007 17:30:00.144825 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="85386158-eea6-47d2-bd74-d43e0058715f" containerName="extract-utilities" Oct 07 17:30:00 crc kubenswrapper[4681]: I1007 17:30:00.145015 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="85386158-eea6-47d2-bd74-d43e0058715f" containerName="registry-server" Oct 07 17:30:00 crc kubenswrapper[4681]: I1007 17:30:00.145672 4681 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330970-dk76w" Oct 07 17:30:00 crc kubenswrapper[4681]: I1007 17:30:00.147599 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 17:30:00 crc kubenswrapper[4681]: I1007 17:30:00.151032 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 17:30:00 crc kubenswrapper[4681]: I1007 17:30:00.155675 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330970-dk76w"] Oct 07 17:30:00 crc kubenswrapper[4681]: I1007 17:30:00.212811 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b3af6da-74ce-4d0c-a479-593e951996b2-secret-volume\") pod \"collect-profiles-29330970-dk76w\" (UID: \"1b3af6da-74ce-4d0c-a479-593e951996b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330970-dk76w" Oct 07 17:30:00 crc kubenswrapper[4681]: I1007 17:30:00.212941 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b3af6da-74ce-4d0c-a479-593e951996b2-config-volume\") pod \"collect-profiles-29330970-dk76w\" (UID: \"1b3af6da-74ce-4d0c-a479-593e951996b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330970-dk76w" Oct 07 17:30:00 crc kubenswrapper[4681]: I1007 17:30:00.213010 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjr8n\" (UniqueName: \"kubernetes.io/projected/1b3af6da-74ce-4d0c-a479-593e951996b2-kube-api-access-fjr8n\") pod \"collect-profiles-29330970-dk76w\" (UID: \"1b3af6da-74ce-4d0c-a479-593e951996b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330970-dk76w" Oct 07 17:30:00 crc kubenswrapper[4681]: I1007 17:30:00.314314 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjr8n\" (UniqueName: \"kubernetes.io/projected/1b3af6da-74ce-4d0c-a479-593e951996b2-kube-api-access-fjr8n\") pod \"collect-profiles-29330970-dk76w\" (UID: \"1b3af6da-74ce-4d0c-a479-593e951996b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330970-dk76w" Oct 07 17:30:00 crc kubenswrapper[4681]: I1007 17:30:00.314413 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b3af6da-74ce-4d0c-a479-593e951996b2-secret-volume\") pod \"collect-profiles-29330970-dk76w\" (UID: \"1b3af6da-74ce-4d0c-a479-593e951996b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330970-dk76w" Oct 07 17:30:00 crc kubenswrapper[4681]: I1007 17:30:00.314495 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b3af6da-74ce-4d0c-a479-593e951996b2-config-volume\") pod \"collect-profiles-29330970-dk76w\" (UID: \"1b3af6da-74ce-4d0c-a479-593e951996b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330970-dk76w" Oct 07 17:30:00 crc kubenswrapper[4681]: I1007 17:30:00.315580 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/1b3af6da-74ce-4d0c-a479-593e951996b2-config-volume\") pod \"collect-profiles-29330970-dk76w\" (UID: \"1b3af6da-74ce-4d0c-a479-593e951996b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330970-dk76w" Oct 07 17:30:00 crc kubenswrapper[4681]: I1007 17:30:00.333686 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b3af6da-74ce-4d0c-a479-593e951996b2-secret-volume\") pod \"collect-profiles-29330970-dk76w\" (UID: \"1b3af6da-74ce-4d0c-a479-593e951996b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330970-dk76w" Oct 07 17:30:00 crc kubenswrapper[4681]: I1007 17:30:00.336971 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjr8n\" (UniqueName: \"kubernetes.io/projected/1b3af6da-74ce-4d0c-a479-593e951996b2-kube-api-access-fjr8n\") pod \"collect-profiles-29330970-dk76w\" (UID: \"1b3af6da-74ce-4d0c-a479-593e951996b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330970-dk76w" Oct 07 17:30:00 crc kubenswrapper[4681]: I1007 17:30:00.520208 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330970-dk76w" Oct 07 17:30:00 crc kubenswrapper[4681]: I1007 17:30:00.982233 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330970-dk76w"] Oct 07 17:30:01 crc kubenswrapper[4681]: I1007 17:30:01.383751 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330970-dk76w" event={"ID":"1b3af6da-74ce-4d0c-a479-593e951996b2","Type":"ContainerStarted","Data":"16d9e238168403139820b21ab6d377cfaf8e6a5ee55dcc80f83f6f4c01c7a1d3"} Oct 07 17:30:01 crc kubenswrapper[4681]: I1007 17:30:01.383789 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330970-dk76w" event={"ID":"1b3af6da-74ce-4d0c-a479-593e951996b2","Type":"ContainerStarted","Data":"6a99fca45df40443508a41e5808a0b664d4281da93fd53ff03ad82abf17526e6"} Oct 07 17:30:01 crc kubenswrapper[4681]: I1007 17:30:01.399838 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29330970-dk76w" podStartSLOduration=1.399816646 podStartE2EDuration="1.399816646s" podCreationTimestamp="2025-10-07 17:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 17:30:01.396092162 +0000 UTC m=+1605.043503737" watchObservedRunningTime="2025-10-07 17:30:01.399816646 +0000 UTC m=+1605.047228201" Oct 07 17:30:02 crc kubenswrapper[4681]: I1007 17:30:02.393972 4681 generic.go:334] "Generic (PLEG): container finished" podID="1b3af6da-74ce-4d0c-a479-593e951996b2" containerID="16d9e238168403139820b21ab6d377cfaf8e6a5ee55dcc80f83f6f4c01c7a1d3" exitCode=0 Oct 07 17:30:02 crc kubenswrapper[4681]: I1007 17:30:02.394067 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330970-dk76w" event={"ID":"1b3af6da-74ce-4d0c-a479-593e951996b2","Type":"ContainerDied","Data":"16d9e238168403139820b21ab6d377cfaf8e6a5ee55dcc80f83f6f4c01c7a1d3"} Oct 07 17:30:03 crc kubenswrapper[4681]: I1007 17:30:03.769174 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330970-dk76w" Oct 07 17:30:03 crc kubenswrapper[4681]: I1007 17:30:03.879407 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b3af6da-74ce-4d0c-a479-593e951996b2-secret-volume\") pod \"1b3af6da-74ce-4d0c-a479-593e951996b2\" (UID: \"1b3af6da-74ce-4d0c-a479-593e951996b2\") " Oct 07 17:30:03 crc kubenswrapper[4681]: I1007 17:30:03.879511 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b3af6da-74ce-4d0c-a479-593e951996b2-config-volume\") pod \"1b3af6da-74ce-4d0c-a479-593e951996b2\" (UID: \"1b3af6da-74ce-4d0c-a479-593e951996b2\") " Oct 07 17:30:03 crc kubenswrapper[4681]: I1007 17:30:03.879749 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjr8n\" (UniqueName: \"kubernetes.io/projected/1b3af6da-74ce-4d0c-a479-593e951996b2-kube-api-access-fjr8n\") pod \"1b3af6da-74ce-4d0c-a479-593e951996b2\" (UID: \"1b3af6da-74ce-4d0c-a479-593e951996b2\") " Oct 07 17:30:03 crc kubenswrapper[4681]: I1007 17:30:03.880671 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b3af6da-74ce-4d0c-a479-593e951996b2-config-volume" (OuterVolumeSpecName: "config-volume") pod "1b3af6da-74ce-4d0c-a479-593e951996b2" (UID: "1b3af6da-74ce-4d0c-a479-593e951996b2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:30:03 crc kubenswrapper[4681]: I1007 17:30:03.885772 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b3af6da-74ce-4d0c-a479-593e951996b2-kube-api-access-fjr8n" (OuterVolumeSpecName: "kube-api-access-fjr8n") pod "1b3af6da-74ce-4d0c-a479-593e951996b2" (UID: "1b3af6da-74ce-4d0c-a479-593e951996b2"). InnerVolumeSpecName "kube-api-access-fjr8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:30:03 crc kubenswrapper[4681]: I1007 17:30:03.887185 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b3af6da-74ce-4d0c-a479-593e951996b2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1b3af6da-74ce-4d0c-a479-593e951996b2" (UID: "1b3af6da-74ce-4d0c-a479-593e951996b2"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:30:03 crc kubenswrapper[4681]: I1007 17:30:03.981956 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjr8n\" (UniqueName: \"kubernetes.io/projected/1b3af6da-74ce-4d0c-a479-593e951996b2-kube-api-access-fjr8n\") on node \"crc\" DevicePath \"\"" Oct 07 17:30:03 crc kubenswrapper[4681]: I1007 17:30:03.981990 4681 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b3af6da-74ce-4d0c-a479-593e951996b2-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 17:30:03 crc kubenswrapper[4681]: I1007 17:30:03.982002 4681 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b3af6da-74ce-4d0c-a479-593e951996b2-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 17:30:04 crc kubenswrapper[4681]: I1007 17:30:04.415672 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330970-dk76w" event={"ID":"1b3af6da-74ce-4d0c-a479-593e951996b2","Type":"ContainerDied","Data":"6a99fca45df40443508a41e5808a0b664d4281da93fd53ff03ad82abf17526e6"} Oct 07 17:30:04 crc kubenswrapper[4681]: I1007 17:30:04.416068 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a99fca45df40443508a41e5808a0b664d4281da93fd53ff03ad82abf17526e6" Oct 07 17:30:04 crc kubenswrapper[4681]: I1007 17:30:04.416154 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330970-dk76w" Oct 07 17:30:09 crc kubenswrapper[4681]: I1007 17:30:09.030721 4681 scope.go:117] "RemoveContainer" containerID="a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" Oct 07 17:30:09 crc kubenswrapper[4681]: E1007 17:30:09.031487 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:30:20 crc kubenswrapper[4681]: I1007 17:30:20.030180 4681 scope.go:117] "RemoveContainer" containerID="a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" Oct 07 17:30:20 crc kubenswrapper[4681]: E1007 17:30:20.032359 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:30:34 crc kubenswrapper[4681]: I1007 17:30:34.318428 4681 scope.go:117] "RemoveContainer" containerID="b27cfe1a4a0bb7f071077c53c8144bc8a034567d2bdf6b1bf2ac69acd1a9b777" Oct 07 17:30:34 crc kubenswrapper[4681]: I1007 17:30:34.346376 4681 scope.go:117] "RemoveContainer" containerID="0883020ee2f473224f1f044a9492c3cb85984bd67c55050f33c7dd3292a5d23b" Oct 07 17:30:35 crc kubenswrapper[4681]: I1007 17:30:35.029079 4681 scope.go:117] "RemoveContainer" containerID="a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" Oct 07 17:30:35 crc 
kubenswrapper[4681]: E1007 17:30:35.029526 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:30:50 crc kubenswrapper[4681]: I1007 17:30:50.029536 4681 scope.go:117] "RemoveContainer" containerID="a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" Oct 07 17:30:50 crc kubenswrapper[4681]: E1007 17:30:50.030434 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:31:01 crc kubenswrapper[4681]: I1007 17:31:01.029323 4681 scope.go:117] "RemoveContainer" containerID="a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" Oct 07 17:31:01 crc kubenswrapper[4681]: E1007 17:31:01.030103 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:31:11 crc kubenswrapper[4681]: I1007 17:31:11.053564 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-xrvzv"] Oct 07 17:31:11 crc kubenswrapper[4681]: I1007 17:31:11.068032 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-rdh4c"] Oct 07 17:31:11 crc kubenswrapper[4681]: I1007 17:31:11.081543 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-vl7gx"] Oct 07 17:31:11 crc kubenswrapper[4681]: I1007 17:31:11.088963 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-xrvzv"] Oct 07 17:31:11 crc kubenswrapper[4681]: I1007 17:31:11.097666 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-rdh4c"] Oct 07 17:31:11 crc kubenswrapper[4681]: I1007 17:31:11.104890 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-vl7gx"] Oct 07 17:31:13 crc kubenswrapper[4681]: I1007 17:31:13.040006 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="503e25f5-b736-418a-b51a-7f5f2ee82ba8" path="/var/lib/kubelet/pods/503e25f5-b736-418a-b51a-7f5f2ee82ba8/volumes" Oct 07 17:31:13 crc kubenswrapper[4681]: I1007 17:31:13.040820 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5473d55d-7c8b-4e5a-ad3c-0b30d31ee9b4" path="/var/lib/kubelet/pods/5473d55d-7c8b-4e5a-ad3c-0b30d31ee9b4/volumes" Oct 07 17:31:13 crc kubenswrapper[4681]: I1007 17:31:13.041805 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4fdd3f9-ef24-465f-96f0-7c09c34124b4" path="/var/lib/kubelet/pods/d4fdd3f9-ef24-465f-96f0-7c09c34124b4/volumes" Oct 07 17:31:16 crc 
kubenswrapper[4681]: I1007 17:31:16.028968 4681 scope.go:117] "RemoveContainer" containerID="a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" Oct 07 17:31:16 crc kubenswrapper[4681]: E1007 17:31:16.029688 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:31:27 crc kubenswrapper[4681]: I1007 17:31:27.041368 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-87a1-account-create-ngqhh"] Oct 07 17:31:27 crc kubenswrapper[4681]: I1007 17:31:27.048690 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-87a1-account-create-ngqhh"] Oct 07 17:31:28 crc kubenswrapper[4681]: I1007 17:31:28.036411 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5f89-account-create-clbjp"] Oct 07 17:31:28 crc kubenswrapper[4681]: I1007 17:31:28.048521 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-5a1b-account-create-swzjl"] Oct 07 17:31:28 crc kubenswrapper[4681]: I1007 17:31:28.057846 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-5a1b-account-create-swzjl"] Oct 07 17:31:28 crc kubenswrapper[4681]: I1007 17:31:28.065190 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5f89-account-create-clbjp"] Oct 07 17:31:29 crc kubenswrapper[4681]: I1007 17:31:29.041249 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ebb4c42-5825-4c0f-9cb5-ab4b915e4bbf" path="/var/lib/kubelet/pods/4ebb4c42-5825-4c0f-9cb5-ab4b915e4bbf/volumes" Oct 07 17:31:29 crc kubenswrapper[4681]: I1007 17:31:29.041844 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5da275ce-2936-42ba-a43d-52605c4f5cb4" path="/var/lib/kubelet/pods/5da275ce-2936-42ba-a43d-52605c4f5cb4/volumes" Oct 07 17:31:29 crc kubenswrapper[4681]: I1007 17:31:29.042457 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72933878-17c1-4cb1-b068-1c19741adf5d" path="/var/lib/kubelet/pods/72933878-17c1-4cb1-b068-1c19741adf5d/volumes" Oct 07 17:31:30 crc kubenswrapper[4681]: I1007 17:31:30.029921 4681 scope.go:117] "RemoveContainer" containerID="a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" Oct 07 17:31:30 crc kubenswrapper[4681]: E1007 17:31:30.030455 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:31:34 crc kubenswrapper[4681]: I1007 17:31:34.433617 4681 scope.go:117] "RemoveContainer" containerID="49bc70636b15c2212e643587ae10c9c263f8fa3d0e0faa8bd87b1930dfbed9aa" Oct 07 17:31:34 crc kubenswrapper[4681]: I1007 17:31:34.457926 4681 scope.go:117] "RemoveContainer" containerID="0f5933db75402c07d0825cb64393d5e1b5794d85ac8fdfa5f281928333d5726d" Oct 07 17:31:34 crc kubenswrapper[4681]: I1007 17:31:34.499570 4681 scope.go:117] "RemoveContainer" 
containerID="7e52e96aa8195974c691c7d8b83a6a780971a1ce108967044f9ac2b7050c4b89" Oct 07 17:31:34 crc kubenswrapper[4681]: I1007 17:31:34.536208 4681 scope.go:117] "RemoveContainer" containerID="f171d9d5a3264d254de50bf4743cd673d165d1fade2b26a157194a10d64259c4" Oct 07 17:31:34 crc kubenswrapper[4681]: I1007 17:31:34.578226 4681 scope.go:117] "RemoveContainer" containerID="143e054bb18ef5ded36340fc337b99fae598b619a949192a70f23a89a3e01f97" Oct 07 17:31:34 crc kubenswrapper[4681]: I1007 17:31:34.630363 4681 scope.go:117] "RemoveContainer" containerID="7965398920f60875d06f0d0a3eb067e7e1fb8c8237c552fa08137e40a5387e54" Oct 07 17:31:41 crc kubenswrapper[4681]: I1007 17:31:41.030619 4681 scope.go:117] "RemoveContainer" containerID="a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" Oct 07 17:31:41 crc kubenswrapper[4681]: E1007 17:31:41.031334 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:31:43 crc kubenswrapper[4681]: I1007 17:31:43.128200 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4d9mv"] Oct 07 17:31:43 crc kubenswrapper[4681]: E1007 17:31:43.130115 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3af6da-74ce-4d0c-a479-593e951996b2" containerName="collect-profiles" Oct 07 17:31:43 crc kubenswrapper[4681]: I1007 17:31:43.130217 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3af6da-74ce-4d0c-a479-593e951996b2" containerName="collect-profiles" Oct 07 17:31:43 crc kubenswrapper[4681]: I1007 17:31:43.130553 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b3af6da-74ce-4d0c-a479-593e951996b2" containerName="collect-profiles" Oct 07 17:31:43 crc kubenswrapper[4681]: I1007 17:31:43.132427 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4d9mv" Oct 07 17:31:43 crc kubenswrapper[4681]: I1007 17:31:43.193943 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4d9mv"] Oct 07 17:31:43 crc kubenswrapper[4681]: I1007 17:31:43.315110 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/621ea6d0-a0ae-4d6c-be2f-1b0178224e43-catalog-content\") pod \"redhat-marketplace-4d9mv\" (UID: \"621ea6d0-a0ae-4d6c-be2f-1b0178224e43\") " pod="openshift-marketplace/redhat-marketplace-4d9mv" Oct 07 17:31:43 crc kubenswrapper[4681]: I1007 17:31:43.315215 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/621ea6d0-a0ae-4d6c-be2f-1b0178224e43-utilities\") pod \"redhat-marketplace-4d9mv\" (UID: \"621ea6d0-a0ae-4d6c-be2f-1b0178224e43\") " pod="openshift-marketplace/redhat-marketplace-4d9mv" Oct 07 17:31:43 crc kubenswrapper[4681]: I1007 17:31:43.315270 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsw5f\" (UniqueName: \"kubernetes.io/projected/621ea6d0-a0ae-4d6c-be2f-1b0178224e43-kube-api-access-fsw5f\") pod \"redhat-marketplace-4d9mv\" (UID: \"621ea6d0-a0ae-4d6c-be2f-1b0178224e43\") " pod="openshift-marketplace/redhat-marketplace-4d9mv" Oct 07 17:31:43 crc kubenswrapper[4681]: I1007 17:31:43.417049 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/621ea6d0-a0ae-4d6c-be2f-1b0178224e43-utilities\") pod \"redhat-marketplace-4d9mv\" (UID: \"621ea6d0-a0ae-4d6c-be2f-1b0178224e43\") " pod="openshift-marketplace/redhat-marketplace-4d9mv" Oct 07 17:31:43 crc kubenswrapper[4681]: I1007 17:31:43.417347 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsw5f\" (UniqueName: \"kubernetes.io/projected/621ea6d0-a0ae-4d6c-be2f-1b0178224e43-kube-api-access-fsw5f\") pod \"redhat-marketplace-4d9mv\" (UID: \"621ea6d0-a0ae-4d6c-be2f-1b0178224e43\") " pod="openshift-marketplace/redhat-marketplace-4d9mv" Oct 07 17:31:43 crc kubenswrapper[4681]: I1007 17:31:43.417603 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/621ea6d0-a0ae-4d6c-be2f-1b0178224e43-catalog-content\") pod \"redhat-marketplace-4d9mv\" (UID: \"621ea6d0-a0ae-4d6c-be2f-1b0178224e43\") " pod="openshift-marketplace/redhat-marketplace-4d9mv" Oct 07 17:31:43 crc kubenswrapper[4681]: I1007 17:31:43.417625 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/621ea6d0-a0ae-4d6c-be2f-1b0178224e43-utilities\") pod \"redhat-marketplace-4d9mv\" (UID: \"621ea6d0-a0ae-4d6c-be2f-1b0178224e43\") " pod="openshift-marketplace/redhat-marketplace-4d9mv" Oct 07 17:31:43 crc kubenswrapper[4681]: I1007 17:31:43.418195 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/621ea6d0-a0ae-4d6c-be2f-1b0178224e43-catalog-content\") pod \"redhat-marketplace-4d9mv\" (UID: \"621ea6d0-a0ae-4d6c-be2f-1b0178224e43\") " pod="openshift-marketplace/redhat-marketplace-4d9mv" Oct 07 17:31:43 crc kubenswrapper[4681]: I1007 17:31:43.449184 4681 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-fsw5f\" (UniqueName: \"kubernetes.io/projected/621ea6d0-a0ae-4d6c-be2f-1b0178224e43-kube-api-access-fsw5f\") pod \"redhat-marketplace-4d9mv\" (UID: \"621ea6d0-a0ae-4d6c-be2f-1b0178224e43\") " pod="openshift-marketplace/redhat-marketplace-4d9mv" Oct 07 17:31:43 crc kubenswrapper[4681]: I1007 17:31:43.509233 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4d9mv" Oct 07 17:31:43 crc kubenswrapper[4681]: I1007 17:31:43.960655 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4d9mv"] Oct 07 17:31:44 crc kubenswrapper[4681]: I1007 17:31:44.355195 4681 generic.go:334] "Generic (PLEG): container finished" podID="621ea6d0-a0ae-4d6c-be2f-1b0178224e43" containerID="409b7f525115663f635e7c029499d2c1ce9fe42e6c340665b0bc1ff9382469c0" exitCode=0 Oct 07 17:31:44 crc kubenswrapper[4681]: I1007 17:31:44.355476 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4d9mv" event={"ID":"621ea6d0-a0ae-4d6c-be2f-1b0178224e43","Type":"ContainerDied","Data":"409b7f525115663f635e7c029499d2c1ce9fe42e6c340665b0bc1ff9382469c0"} Oct 07 17:31:44 crc kubenswrapper[4681]: I1007 17:31:44.355504 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4d9mv" event={"ID":"621ea6d0-a0ae-4d6c-be2f-1b0178224e43","Type":"ContainerStarted","Data":"9c1c15dd3df8aea1f8bcbe7fbc562ed18c0d604b400ab7f7f6a5f71129696a14"} Oct 07 17:31:44 crc kubenswrapper[4681]: I1007 17:31:44.357981 4681 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 17:31:46 crc kubenswrapper[4681]: I1007 17:31:46.373908 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4d9mv" event={"ID":"621ea6d0-a0ae-4d6c-be2f-1b0178224e43","Type":"ContainerStarted","Data":"2d5a7cd0c5528f778ec8dd355867554bbbee75a767c6a61426ca1f2116dbfe41"} Oct 07 17:31:47 crc kubenswrapper[4681]: I1007 17:31:47.391905 4681 generic.go:334] "Generic (PLEG): container finished" podID="621ea6d0-a0ae-4d6c-be2f-1b0178224e43" containerID="2d5a7cd0c5528f778ec8dd355867554bbbee75a767c6a61426ca1f2116dbfe41" exitCode=0 Oct 07 17:31:47 crc kubenswrapper[4681]: I1007 17:31:47.392033 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4d9mv" event={"ID":"621ea6d0-a0ae-4d6c-be2f-1b0178224e43","Type":"ContainerDied","Data":"2d5a7cd0c5528f778ec8dd355867554bbbee75a767c6a61426ca1f2116dbfe41"} Oct 07 17:31:48 crc kubenswrapper[4681]: I1007 17:31:48.401445 4681 generic.go:334] "Generic (PLEG): container finished" podID="5da1ef34-103f-4687-8454-89abe7b61f54" containerID="8c26b3edbe3399bdfc105da74d5615c7a90869b2117a49b20d0075a9775e1a6d" exitCode=0 Oct 07 17:31:48 crc kubenswrapper[4681]: I1007 17:31:48.401528 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l" event={"ID":"5da1ef34-103f-4687-8454-89abe7b61f54","Type":"ContainerDied","Data":"8c26b3edbe3399bdfc105da74d5615c7a90869b2117a49b20d0075a9775e1a6d"} Oct 07 17:31:48 crc kubenswrapper[4681]: I1007 17:31:48.403951 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4d9mv" event={"ID":"621ea6d0-a0ae-4d6c-be2f-1b0178224e43","Type":"ContainerStarted","Data":"7208f0b037b23174c87d3dc0a73d23f2cf1d515e13fc602f64bbb0aff0a15398"} Oct 07 17:31:48 crc 
kubenswrapper[4681]: I1007 17:31:48.440200 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4d9mv" podStartSLOduration=1.649411374 podStartE2EDuration="5.44018264s" podCreationTimestamp="2025-10-07 17:31:43 +0000 UTC" firstStartedPulling="2025-10-07 17:31:44.357689846 +0000 UTC m=+1708.005101401" lastFinishedPulling="2025-10-07 17:31:48.148461112 +0000 UTC m=+1711.795872667" observedRunningTime="2025-10-07 17:31:48.433673429 +0000 UTC m=+1712.081085004" watchObservedRunningTime="2025-10-07 17:31:48.44018264 +0000 UTC m=+1712.087594195" Oct 07 17:31:49 crc kubenswrapper[4681]: I1007 17:31:49.046807 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-rhm4v"] Oct 07 17:31:49 crc kubenswrapper[4681]: I1007 17:31:49.054911 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-xcdt5"] Oct 07 17:31:49 crc kubenswrapper[4681]: I1007 17:31:49.068328 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-rhm4v"] Oct 07 17:31:49 crc kubenswrapper[4681]: I1007 17:31:49.077379 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-d2gsl"] Oct 07 17:31:49 crc kubenswrapper[4681]: I1007 17:31:49.083970 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-xcdt5"] Oct 07 17:31:49 crc kubenswrapper[4681]: I1007 17:31:49.090437 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-d2gsl"] Oct 07 17:31:49 crc kubenswrapper[4681]: I1007 17:31:49.797085 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l" Oct 07 17:31:49 crc kubenswrapper[4681]: I1007 17:31:49.931142 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5da1ef34-103f-4687-8454-89abe7b61f54-bootstrap-combined-ca-bundle\") pod \"5da1ef34-103f-4687-8454-89abe7b61f54\" (UID: \"5da1ef34-103f-4687-8454-89abe7b61f54\") " Oct 07 17:31:49 crc kubenswrapper[4681]: I1007 17:31:49.931575 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5da1ef34-103f-4687-8454-89abe7b61f54-inventory\") pod \"5da1ef34-103f-4687-8454-89abe7b61f54\" (UID: \"5da1ef34-103f-4687-8454-89abe7b61f54\") " Oct 07 17:31:49 crc kubenswrapper[4681]: I1007 17:31:49.931714 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd7dt\" (UniqueName: \"kubernetes.io/projected/5da1ef34-103f-4687-8454-89abe7b61f54-kube-api-access-fd7dt\") pod \"5da1ef34-103f-4687-8454-89abe7b61f54\" (UID: \"5da1ef34-103f-4687-8454-89abe7b61f54\") " Oct 07 17:31:49 crc kubenswrapper[4681]: I1007 17:31:49.932530 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5da1ef34-103f-4687-8454-89abe7b61f54-ssh-key\") pod \"5da1ef34-103f-4687-8454-89abe7b61f54\" (UID: \"5da1ef34-103f-4687-8454-89abe7b61f54\") " Oct 07 17:31:49 crc kubenswrapper[4681]: I1007 17:31:49.951368 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5da1ef34-103f-4687-8454-89abe7b61f54-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "5da1ef34-103f-4687-8454-89abe7b61f54" (UID: 
"5da1ef34-103f-4687-8454-89abe7b61f54"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:31:49 crc kubenswrapper[4681]: I1007 17:31:49.951436 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5da1ef34-103f-4687-8454-89abe7b61f54-kube-api-access-fd7dt" (OuterVolumeSpecName: "kube-api-access-fd7dt") pod "5da1ef34-103f-4687-8454-89abe7b61f54" (UID: "5da1ef34-103f-4687-8454-89abe7b61f54"). InnerVolumeSpecName "kube-api-access-fd7dt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:31:49 crc kubenswrapper[4681]: I1007 17:31:49.960416 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5da1ef34-103f-4687-8454-89abe7b61f54-inventory" (OuterVolumeSpecName: "inventory") pod "5da1ef34-103f-4687-8454-89abe7b61f54" (UID: "5da1ef34-103f-4687-8454-89abe7b61f54"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:31:49 crc kubenswrapper[4681]: I1007 17:31:49.960445 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5da1ef34-103f-4687-8454-89abe7b61f54-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5da1ef34-103f-4687-8454-89abe7b61f54" (UID: "5da1ef34-103f-4687-8454-89abe7b61f54"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:31:50 crc kubenswrapper[4681]: I1007 17:31:50.034892 4681 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5da1ef34-103f-4687-8454-89abe7b61f54-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 17:31:50 crc kubenswrapper[4681]: I1007 17:31:50.034928 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd7dt\" (UniqueName: \"kubernetes.io/projected/5da1ef34-103f-4687-8454-89abe7b61f54-kube-api-access-fd7dt\") on node \"crc\" DevicePath \"\"" Oct 07 17:31:50 crc kubenswrapper[4681]: I1007 17:31:50.034941 4681 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5da1ef34-103f-4687-8454-89abe7b61f54-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 17:31:50 crc kubenswrapper[4681]: I1007 17:31:50.034953 4681 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5da1ef34-103f-4687-8454-89abe7b61f54-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:31:50 crc kubenswrapper[4681]: I1007 17:31:50.422432 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l" event={"ID":"5da1ef34-103f-4687-8454-89abe7b61f54","Type":"ContainerDied","Data":"9fc5e3dc7863ca49aee6022bdee2b120361c94ea5bfb07d19952aed2ba5de031"} Oct 07 17:31:50 crc kubenswrapper[4681]: I1007 17:31:50.422475 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fc5e3dc7863ca49aee6022bdee2b120361c94ea5bfb07d19952aed2ba5de031" Oct 07 17:31:50 crc kubenswrapper[4681]: I1007 17:31:50.422502 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l" Oct 07 17:31:50 crc kubenswrapper[4681]: I1007 17:31:50.524524 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7"] Oct 07 17:31:50 crc kubenswrapper[4681]: E1007 17:31:50.525013 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da1ef34-103f-4687-8454-89abe7b61f54" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 07 17:31:50 crc kubenswrapper[4681]: I1007 17:31:50.525035 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da1ef34-103f-4687-8454-89abe7b61f54" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 07 17:31:50 crc kubenswrapper[4681]: I1007 17:31:50.525285 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="5da1ef34-103f-4687-8454-89abe7b61f54" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 07 17:31:50 crc kubenswrapper[4681]: I1007 17:31:50.526066 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7" Oct 07 17:31:50 crc kubenswrapper[4681]: I1007 17:31:50.531380 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 17:31:50 crc kubenswrapper[4681]: I1007 17:31:50.531592 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 17:31:50 crc kubenswrapper[4681]: I1007 17:31:50.531874 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 17:31:50 crc kubenswrapper[4681]: I1007 17:31:50.532021 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vtl6" Oct 07 17:31:50 crc kubenswrapper[4681]: I1007 17:31:50.576397 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7"] Oct 07 17:31:50 crc kubenswrapper[4681]: I1007 17:31:50.645415 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41bd87d5-77d6-4866-b9b8-aaed777393b5-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7\" (UID: \"41bd87d5-77d6-4866-b9b8-aaed777393b5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7" Oct 07 17:31:50 crc kubenswrapper[4681]: I1007 17:31:50.645479 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41bd87d5-77d6-4866-b9b8-aaed777393b5-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7\" (UID: \"41bd87d5-77d6-4866-b9b8-aaed777393b5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7" Oct 07 17:31:50 crc kubenswrapper[4681]: I1007 17:31:50.645531 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6xxm\" (UniqueName: \"kubernetes.io/projected/41bd87d5-77d6-4866-b9b8-aaed777393b5-kube-api-access-m6xxm\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7\" (UID: \"41bd87d5-77d6-4866-b9b8-aaed777393b5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7" Oct 07 17:31:50 crc kubenswrapper[4681]: I1007 17:31:50.747357 4681 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41bd87d5-77d6-4866-b9b8-aaed777393b5-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7\" (UID: \"41bd87d5-77d6-4866-b9b8-aaed777393b5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7" Oct 07 17:31:50 crc kubenswrapper[4681]: I1007 17:31:50.747552 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41bd87d5-77d6-4866-b9b8-aaed777393b5-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7\" (UID: \"41bd87d5-77d6-4866-b9b8-aaed777393b5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7" Oct 07 17:31:50 crc kubenswrapper[4681]: I1007 17:31:50.747667 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6xxm\" (UniqueName: \"kubernetes.io/projected/41bd87d5-77d6-4866-b9b8-aaed777393b5-kube-api-access-m6xxm\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7\" (UID: \"41bd87d5-77d6-4866-b9b8-aaed777393b5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7" Oct 07 17:31:50 crc kubenswrapper[4681]: I1007 17:31:50.751325 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41bd87d5-77d6-4866-b9b8-aaed777393b5-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7\" (UID: \"41bd87d5-77d6-4866-b9b8-aaed777393b5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7" Oct 07 17:31:50 crc kubenswrapper[4681]: I1007 17:31:50.752137 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41bd87d5-77d6-4866-b9b8-aaed777393b5-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7\" (UID: \"41bd87d5-77d6-4866-b9b8-aaed777393b5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7" Oct 07 17:31:50 crc kubenswrapper[4681]: I1007 17:31:50.764043 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6xxm\" (UniqueName: \"kubernetes.io/projected/41bd87d5-77d6-4866-b9b8-aaed777393b5-kube-api-access-m6xxm\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7\" (UID: \"41bd87d5-77d6-4866-b9b8-aaed777393b5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7" Oct 07 17:31:50 crc kubenswrapper[4681]: I1007 17:31:50.878556 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7" Oct 07 17:31:51 crc kubenswrapper[4681]: I1007 17:31:51.043810 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fd1c96e-3ade-4991-a129-9e12fd5837db" path="/var/lib/kubelet/pods/2fd1c96e-3ade-4991-a129-9e12fd5837db/volumes" Oct 07 17:31:51 crc kubenswrapper[4681]: I1007 17:31:51.047066 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ab816f0-aaab-4889-817f-3cbe0492dfe0" path="/var/lib/kubelet/pods/6ab816f0-aaab-4889-817f-3cbe0492dfe0/volumes" Oct 07 17:31:51 crc kubenswrapper[4681]: I1007 17:31:51.050609 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5409c34-27ff-4970-a35c-5bb5ee377fb9" path="/var/lib/kubelet/pods/f5409c34-27ff-4970-a35c-5bb5ee377fb9/volumes" Oct 07 17:31:51 crc kubenswrapper[4681]: I1007 17:31:51.411615 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7"] Oct 07 17:31:51 crc kubenswrapper[4681]: I1007 17:31:51.433369 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7" event={"ID":"41bd87d5-77d6-4866-b9b8-aaed777393b5","Type":"ContainerStarted","Data":"a52ef6e74e7dbe516aebd9519c37c5ac3f5f179e7b444dcb271de42e698c72bb"} Oct 07 17:31:52 crc kubenswrapper[4681]: I1007 17:31:52.029962 4681 scope.go:117] "RemoveContainer" containerID="a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" Oct 07 17:31:52 crc kubenswrapper[4681]: E1007 17:31:52.030330 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:31:52 crc kubenswrapper[4681]: I1007 17:31:52.445442 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7" event={"ID":"41bd87d5-77d6-4866-b9b8-aaed777393b5","Type":"ContainerStarted","Data":"2d6149edb43d977a12dc92bb6c789df5abd50e224c9e4ea3793519f92922f8bd"} Oct 07 17:31:52 crc kubenswrapper[4681]: I1007 17:31:52.467561 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7" podStartSLOduration=2.310910937 podStartE2EDuration="2.467546627s" podCreationTimestamp="2025-10-07 17:31:50 +0000 UTC" firstStartedPulling="2025-10-07 17:31:51.416609098 +0000 UTC m=+1715.064020653" lastFinishedPulling="2025-10-07 17:31:51.573244788 +0000 UTC m=+1715.220656343" observedRunningTime="2025-10-07 17:31:52.463418912 +0000 UTC m=+1716.110830487" watchObservedRunningTime="2025-10-07 17:31:52.467546627 +0000 UTC m=+1716.114958182" Oct 07 17:31:53 crc kubenswrapper[4681]: I1007 17:31:53.509605 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4d9mv" Oct 07 17:31:53 crc kubenswrapper[4681]: I1007 17:31:53.509817 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4d9mv" Oct 07 17:31:53 crc kubenswrapper[4681]: I1007 17:31:53.567995 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-marketplace-4d9mv" Oct 07 17:31:54 crc kubenswrapper[4681]: I1007 17:31:54.512246 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4d9mv" Oct 07 17:31:54 crc kubenswrapper[4681]: I1007 17:31:54.561765 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4d9mv"] Oct 07 17:31:56 crc kubenswrapper[4681]: I1007 17:31:56.031910 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-ht4mq"] Oct 07 17:31:56 crc kubenswrapper[4681]: I1007 17:31:56.049001 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-ht4mq"] Oct 07 17:31:56 crc kubenswrapper[4681]: I1007 17:31:56.479948 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4d9mv" podUID="621ea6d0-a0ae-4d6c-be2f-1b0178224e43" containerName="registry-server" containerID="cri-o://7208f0b037b23174c87d3dc0a73d23f2cf1d515e13fc602f64bbb0aff0a15398" gracePeriod=2 Oct 07 17:31:56 crc kubenswrapper[4681]: I1007 17:31:56.916588 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4d9mv" Oct 07 17:31:57 crc kubenswrapper[4681]: I1007 17:31:57.041414 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6347ab39-7d52-4a04-ac0a-3df98268b8fe" path="/var/lib/kubelet/pods/6347ab39-7d52-4a04-ac0a-3df98268b8fe/volumes" Oct 07 17:31:57 crc kubenswrapper[4681]: I1007 17:31:57.098941 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsw5f\" (UniqueName: \"kubernetes.io/projected/621ea6d0-a0ae-4d6c-be2f-1b0178224e43-kube-api-access-fsw5f\") pod \"621ea6d0-a0ae-4d6c-be2f-1b0178224e43\" (UID: \"621ea6d0-a0ae-4d6c-be2f-1b0178224e43\") " Oct 07 17:31:57 crc kubenswrapper[4681]: I1007 17:31:57.099186 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/621ea6d0-a0ae-4d6c-be2f-1b0178224e43-utilities\") pod \"621ea6d0-a0ae-4d6c-be2f-1b0178224e43\" (UID: \"621ea6d0-a0ae-4d6c-be2f-1b0178224e43\") " Oct 07 17:31:57 crc kubenswrapper[4681]: I1007 17:31:57.099352 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/621ea6d0-a0ae-4d6c-be2f-1b0178224e43-catalog-content\") pod \"621ea6d0-a0ae-4d6c-be2f-1b0178224e43\" (UID: \"621ea6d0-a0ae-4d6c-be2f-1b0178224e43\") " Oct 07 17:31:57 crc kubenswrapper[4681]: I1007 17:31:57.100362 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/621ea6d0-a0ae-4d6c-be2f-1b0178224e43-utilities" (OuterVolumeSpecName: "utilities") pod "621ea6d0-a0ae-4d6c-be2f-1b0178224e43" (UID: "621ea6d0-a0ae-4d6c-be2f-1b0178224e43"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:31:57 crc kubenswrapper[4681]: I1007 17:31:57.114184 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/621ea6d0-a0ae-4d6c-be2f-1b0178224e43-kube-api-access-fsw5f" (OuterVolumeSpecName: "kube-api-access-fsw5f") pod "621ea6d0-a0ae-4d6c-be2f-1b0178224e43" (UID: "621ea6d0-a0ae-4d6c-be2f-1b0178224e43"). InnerVolumeSpecName "kube-api-access-fsw5f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:31:57 crc kubenswrapper[4681]: I1007 17:31:57.123687 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/621ea6d0-a0ae-4d6c-be2f-1b0178224e43-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "621ea6d0-a0ae-4d6c-be2f-1b0178224e43" (UID: "621ea6d0-a0ae-4d6c-be2f-1b0178224e43"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:31:57 crc kubenswrapper[4681]: I1007 17:31:57.202428 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/621ea6d0-a0ae-4d6c-be2f-1b0178224e43-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 17:31:57 crc kubenswrapper[4681]: I1007 17:31:57.202459 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/621ea6d0-a0ae-4d6c-be2f-1b0178224e43-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 17:31:57 crc kubenswrapper[4681]: I1007 17:31:57.202494 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsw5f\" (UniqueName: \"kubernetes.io/projected/621ea6d0-a0ae-4d6c-be2f-1b0178224e43-kube-api-access-fsw5f\") on node \"crc\" DevicePath \"\"" Oct 07 17:31:57 crc kubenswrapper[4681]: I1007 17:31:57.488814 4681 generic.go:334] "Generic (PLEG): container finished" podID="621ea6d0-a0ae-4d6c-be2f-1b0178224e43" containerID="7208f0b037b23174c87d3dc0a73d23f2cf1d515e13fc602f64bbb0aff0a15398" exitCode=0 Oct 07 17:31:57 crc kubenswrapper[4681]: I1007 17:31:57.488857 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4d9mv" event={"ID":"621ea6d0-a0ae-4d6c-be2f-1b0178224e43","Type":"ContainerDied","Data":"7208f0b037b23174c87d3dc0a73d23f2cf1d515e13fc602f64bbb0aff0a15398"} Oct 07 17:31:57 crc kubenswrapper[4681]: I1007 17:31:57.488884 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4d9mv" event={"ID":"621ea6d0-a0ae-4d6c-be2f-1b0178224e43","Type":"ContainerDied","Data":"9c1c15dd3df8aea1f8bcbe7fbc562ed18c0d604b400ab7f7f6a5f71129696a14"} Oct 07 17:31:57 crc kubenswrapper[4681]: I1007 17:31:57.488893 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4d9mv" Oct 07 17:31:57 crc kubenswrapper[4681]: I1007 17:31:57.488965 4681 scope.go:117] "RemoveContainer" containerID="7208f0b037b23174c87d3dc0a73d23f2cf1d515e13fc602f64bbb0aff0a15398" Oct 07 17:31:57 crc kubenswrapper[4681]: I1007 17:31:57.520907 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4d9mv"] Oct 07 17:31:57 crc kubenswrapper[4681]: I1007 17:31:57.527444 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4d9mv"] Oct 07 17:31:57 crc kubenswrapper[4681]: I1007 17:31:57.532641 4681 scope.go:117] "RemoveContainer" containerID="2d5a7cd0c5528f778ec8dd355867554bbbee75a767c6a61426ca1f2116dbfe41" Oct 07 17:31:57 crc kubenswrapper[4681]: I1007 17:31:57.553530 4681 scope.go:117] "RemoveContainer" containerID="409b7f525115663f635e7c029499d2c1ce9fe42e6c340665b0bc1ff9382469c0" Oct 07 17:31:57 crc kubenswrapper[4681]: I1007 17:31:57.595674 4681 scope.go:117] "RemoveContainer" containerID="7208f0b037b23174c87d3dc0a73d23f2cf1d515e13fc602f64bbb0aff0a15398" Oct 07 17:31:57 crc kubenswrapper[4681]: E1007 17:31:57.596187 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7208f0b037b23174c87d3dc0a73d23f2cf1d515e13fc602f64bbb0aff0a15398\": container with ID starting with 7208f0b037b23174c87d3dc0a73d23f2cf1d515e13fc602f64bbb0aff0a15398 not found: ID does not exist" containerID="7208f0b037b23174c87d3dc0a73d23f2cf1d515e13fc602f64bbb0aff0a15398" Oct 07 17:31:57 crc kubenswrapper[4681]: I1007 17:31:57.596217 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7208f0b037b23174c87d3dc0a73d23f2cf1d515e13fc602f64bbb0aff0a15398"} err="failed to get container status \"7208f0b037b23174c87d3dc0a73d23f2cf1d515e13fc602f64bbb0aff0a15398\": rpc error: code = NotFound desc = could not find container \"7208f0b037b23174c87d3dc0a73d23f2cf1d515e13fc602f64bbb0aff0a15398\": container with ID starting with 7208f0b037b23174c87d3dc0a73d23f2cf1d515e13fc602f64bbb0aff0a15398 not found: ID does not exist" Oct 07 17:31:57 crc kubenswrapper[4681]: I1007 17:31:57.596238 4681 scope.go:117] "RemoveContainer" containerID="2d5a7cd0c5528f778ec8dd355867554bbbee75a767c6a61426ca1f2116dbfe41" Oct 07 17:31:57 crc kubenswrapper[4681]: E1007 17:31:57.596559 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d5a7cd0c5528f778ec8dd355867554bbbee75a767c6a61426ca1f2116dbfe41\": container with ID starting with 2d5a7cd0c5528f778ec8dd355867554bbbee75a767c6a61426ca1f2116dbfe41 not found: ID does not exist" containerID="2d5a7cd0c5528f778ec8dd355867554bbbee75a767c6a61426ca1f2116dbfe41" Oct 07 17:31:57 crc kubenswrapper[4681]: I1007 17:31:57.596581 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d5a7cd0c5528f778ec8dd355867554bbbee75a767c6a61426ca1f2116dbfe41"} err="failed to get container status \"2d5a7cd0c5528f778ec8dd355867554bbbee75a767c6a61426ca1f2116dbfe41\": rpc error: code = NotFound desc = could not find container \"2d5a7cd0c5528f778ec8dd355867554bbbee75a767c6a61426ca1f2116dbfe41\": container with ID starting with 2d5a7cd0c5528f778ec8dd355867554bbbee75a767c6a61426ca1f2116dbfe41 not found: ID does not exist" Oct 07 17:31:57 crc kubenswrapper[4681]: I1007 17:31:57.596593 4681 scope.go:117] "RemoveContainer" 
containerID="409b7f525115663f635e7c029499d2c1ce9fe42e6c340665b0bc1ff9382469c0" Oct 07 17:31:57 crc kubenswrapper[4681]: E1007 17:31:57.596859 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"409b7f525115663f635e7c029499d2c1ce9fe42e6c340665b0bc1ff9382469c0\": container with ID starting with 409b7f525115663f635e7c029499d2c1ce9fe42e6c340665b0bc1ff9382469c0 not found: ID does not exist" containerID="409b7f525115663f635e7c029499d2c1ce9fe42e6c340665b0bc1ff9382469c0" Oct 07 17:31:57 crc kubenswrapper[4681]: I1007 17:31:57.596902 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"409b7f525115663f635e7c029499d2c1ce9fe42e6c340665b0bc1ff9382469c0"} err="failed to get container status \"409b7f525115663f635e7c029499d2c1ce9fe42e6c340665b0bc1ff9382469c0\": rpc error: code = NotFound desc = could not find container \"409b7f525115663f635e7c029499d2c1ce9fe42e6c340665b0bc1ff9382469c0\": container with ID starting with 409b7f525115663f635e7c029499d2c1ce9fe42e6c340665b0bc1ff9382469c0 not found: ID does not exist" Oct 07 17:31:59 crc kubenswrapper[4681]: I1007 17:31:59.040739 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="621ea6d0-a0ae-4d6c-be2f-1b0178224e43" path="/var/lib/kubelet/pods/621ea6d0-a0ae-4d6c-be2f-1b0178224e43/volumes" Oct 07 17:32:05 crc kubenswrapper[4681]: I1007 17:32:05.029196 4681 scope.go:117] "RemoveContainer" containerID="a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" Oct 07 17:32:05 crc kubenswrapper[4681]: E1007 17:32:05.030061 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:32:14 crc kubenswrapper[4681]: I1007 17:32:14.034524 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7418-account-create-2nwdq"] Oct 07 17:32:14 crc kubenswrapper[4681]: I1007 17:32:14.041755 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-234c-account-create-qhmzc"] Oct 07 17:32:14 crc kubenswrapper[4681]: I1007 17:32:14.048625 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-234c-account-create-qhmzc"] Oct 07 17:32:14 crc kubenswrapper[4681]: I1007 17:32:14.058148 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-7418-account-create-2nwdq"] Oct 07 17:32:15 crc kubenswrapper[4681]: I1007 17:32:15.041925 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05edf8ae-0f11-4fb5-8441-6400e4d49ec1" path="/var/lib/kubelet/pods/05edf8ae-0f11-4fb5-8441-6400e4d49ec1/volumes" Oct 07 17:32:15 crc kubenswrapper[4681]: I1007 17:32:15.045721 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45ce9368-1192-42cf-bd0d-e0f5a208ea77" path="/var/lib/kubelet/pods/45ce9368-1192-42cf-bd0d-e0f5a208ea77/volumes" Oct 07 17:32:16 crc kubenswrapper[4681]: I1007 17:32:16.033185 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-pgdhp"] Oct 07 17:32:16 crc kubenswrapper[4681]: I1007 17:32:16.041095 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-pgdhp"] Oct 07 
17:32:16 crc kubenswrapper[4681]: I1007 17:32:16.048120 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-009d-account-create-cwtbh"] Oct 07 17:32:16 crc kubenswrapper[4681]: I1007 17:32:16.057192 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-009d-account-create-cwtbh"] Oct 07 17:32:17 crc kubenswrapper[4681]: I1007 17:32:17.041258 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fd19541-0a38-4bab-bc65-ac2700770ce1" path="/var/lib/kubelet/pods/2fd19541-0a38-4bab-bc65-ac2700770ce1/volumes" Oct 07 17:32:17 crc kubenswrapper[4681]: I1007 17:32:17.042156 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f485ed7-a13a-4cee-b4ef-8df4e9659394" path="/var/lib/kubelet/pods/7f485ed7-a13a-4cee-b4ef-8df4e9659394/volumes" Oct 07 17:32:20 crc kubenswrapper[4681]: I1007 17:32:20.030182 4681 scope.go:117] "RemoveContainer" containerID="a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" Oct 07 17:32:20 crc kubenswrapper[4681]: E1007 17:32:20.031046 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:32:23 crc kubenswrapper[4681]: I1007 17:32:23.047196 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-m2s62"] Oct 07 17:32:23 crc kubenswrapper[4681]: I1007 17:32:23.047677 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-m2s62"] Oct 07 17:32:25 crc kubenswrapper[4681]: I1007 17:32:25.041624 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdb73657-7045-4536-b856-81fcc6da6718" path="/var/lib/kubelet/pods/fdb73657-7045-4536-b856-81fcc6da6718/volumes" Oct 07 17:32:27 crc kubenswrapper[4681]: I1007 17:32:27.026604 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-k2qcd"] Oct 07 17:32:27 crc kubenswrapper[4681]: I1007 17:32:27.044054 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-k2qcd"] Oct 07 17:32:29 crc kubenswrapper[4681]: I1007 17:32:29.040767 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9405f877-b9a6-4d64-92f1-df500e73046f" path="/var/lib/kubelet/pods/9405f877-b9a6-4d64-92f1-df500e73046f/volumes" Oct 07 17:32:34 crc kubenswrapper[4681]: I1007 17:32:34.785636 4681 scope.go:117] "RemoveContainer" containerID="5580871f26d9f7dcc73fee042cbdd5c2de607e9b709a34fd42a3aa1cdc023ad1" Oct 07 17:32:34 crc kubenswrapper[4681]: I1007 17:32:34.820170 4681 scope.go:117] "RemoveContainer" containerID="cde73141f00eca0b7472798a820892d64bf39990390b19068b59b3b5b62b7e0e" Oct 07 17:32:34 crc kubenswrapper[4681]: I1007 17:32:34.867485 4681 scope.go:117] "RemoveContainer" containerID="4b81540b31ab1ead44d3eb742ea3865a8a49c299ce9949336faf0b5850aa3429" Oct 07 17:32:34 crc kubenswrapper[4681]: I1007 17:32:34.901283 4681 scope.go:117] "RemoveContainer" containerID="c6c43115eaace82fb286e9fd45284f16d87955e26ead779d3870669b14e55187" Oct 07 17:32:34 crc kubenswrapper[4681]: I1007 17:32:34.938702 4681 scope.go:117] "RemoveContainer" containerID="5120d5b99de7792408773647009766d11598ab9469f32c5e7c9ca7fea3cc167a" Oct 07 17:32:34 
crc kubenswrapper[4681]: I1007 17:32:34.978728 4681 scope.go:117] "RemoveContainer" containerID="7584cfc8bbc92faa2c408828d41ef6e185457dda821cda29b7f3fc7981bdb801" Oct 07 17:32:35 crc kubenswrapper[4681]: I1007 17:32:35.028183 4681 scope.go:117] "RemoveContainer" containerID="d1dba4d4bb0773cc6a80a0a008e4e7301ebf9bef5c0fd52a2ecadeb06f3d90e6" Oct 07 17:32:35 crc kubenswrapper[4681]: I1007 17:32:35.032641 4681 scope.go:117] "RemoveContainer" containerID="a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" Oct 07 17:32:35 crc kubenswrapper[4681]: E1007 17:32:35.033099 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:32:35 crc kubenswrapper[4681]: I1007 17:32:35.051224 4681 scope.go:117] "RemoveContainer" containerID="522b4b57e4561b39de92c9d31fcf44554d94c5eac7b1ee302818010040107912" Oct 07 17:32:35 crc kubenswrapper[4681]: I1007 17:32:35.069896 4681 scope.go:117] "RemoveContainer" containerID="36747c666167852514cd58d2dc0e07155503cdc9059811705518c93231ea5007" Oct 07 17:32:35 crc kubenswrapper[4681]: I1007 17:32:35.095226 4681 scope.go:117] "RemoveContainer" containerID="f9a3bc8cc397dec5ed9a2ce1f025c44880cc5cbaccbb83ab609e58cbe58b6282" Oct 07 17:32:50 crc kubenswrapper[4681]: I1007 17:32:50.029241 4681 scope.go:117] "RemoveContainer" containerID="a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" Oct 07 17:32:50 crc kubenswrapper[4681]: E1007 17:32:50.029958 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:33:05 crc kubenswrapper[4681]: I1007 17:33:05.029498 4681 scope.go:117] "RemoveContainer" containerID="a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" Oct 07 17:33:05 crc kubenswrapper[4681]: E1007 17:33:05.030242 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:33:19 crc kubenswrapper[4681]: I1007 17:33:19.029110 4681 scope.go:117] "RemoveContainer" containerID="a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" Oct 07 17:33:19 crc kubenswrapper[4681]: E1007 17:33:19.029779 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" 
podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:33:25 crc kubenswrapper[4681]: I1007 17:33:25.047994 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-lvsnj"] Oct 07 17:33:25 crc kubenswrapper[4681]: I1007 17:33:25.057095 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-lvsnj"] Oct 07 17:33:27 crc kubenswrapper[4681]: I1007 17:33:27.050929 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6" path="/var/lib/kubelet/pods/98d2a8b1-c8a8-4bc1-a9db-545f75abfcc6/volumes" Oct 07 17:33:31 crc kubenswrapper[4681]: I1007 17:33:31.030199 4681 scope.go:117] "RemoveContainer" containerID="a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" Oct 07 17:33:31 crc kubenswrapper[4681]: E1007 17:33:31.031476 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:33:33 crc kubenswrapper[4681]: I1007 17:33:33.042487 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-ccnch"] Oct 07 17:33:33 crc kubenswrapper[4681]: I1007 17:33:33.042522 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-ccnch"] Oct 07 17:33:35 crc kubenswrapper[4681]: I1007 17:33:35.039111 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="236dd612-86c8-413b-8ec4-c0f2a55fbf9a" path="/var/lib/kubelet/pods/236dd612-86c8-413b-8ec4-c0f2a55fbf9a/volumes" Oct 07 17:33:35 crc kubenswrapper[4681]: I1007 17:33:35.321402 4681 scope.go:117] "RemoveContainer" containerID="bd9eeaa1933fc5b81841139e6a9f5b6cde0d6a5f14c328f1c1c7a60d9d0d0f73" Oct 07 17:33:35 crc kubenswrapper[4681]: I1007 17:33:35.346279 4681 scope.go:117] "RemoveContainer" containerID="202467d9e5207e358cd051d20deafde7b062577ece5c24287b2327a5711be6c3" Oct 07 17:33:39 crc kubenswrapper[4681]: I1007 17:33:39.052284 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-nt64g"] Oct 07 17:33:39 crc kubenswrapper[4681]: I1007 17:33:39.061946 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-kvv88"] Oct 07 17:33:39 crc kubenswrapper[4681]: I1007 17:33:39.078799 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-d8nsb"] Oct 07 17:33:39 crc kubenswrapper[4681]: I1007 17:33:39.087903 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-nt64g"] Oct 07 17:33:39 crc kubenswrapper[4681]: I1007 17:33:39.095910 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-kvv88"] Oct 07 17:33:39 crc kubenswrapper[4681]: I1007 17:33:39.104927 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-d8nsb"] Oct 07 17:33:41 crc kubenswrapper[4681]: I1007 17:33:41.038748 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13c6b5bc-aeb1-47bb-995f-cf7d67007900" path="/var/lib/kubelet/pods/13c6b5bc-aeb1-47bb-995f-cf7d67007900/volumes" Oct 07 17:33:41 crc kubenswrapper[4681]: I1007 17:33:41.039672 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="57a8142b-ea3d-4907-8331-885c973462eb" path="/var/lib/kubelet/pods/57a8142b-ea3d-4907-8331-885c973462eb/volumes" Oct 07 17:33:41 crc kubenswrapper[4681]: I1007 17:33:41.040180 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85befb0e-1557-44bb-b783-f0ea67d38de9" path="/var/lib/kubelet/pods/85befb0e-1557-44bb-b783-f0ea67d38de9/volumes" Oct 07 17:33:44 crc kubenswrapper[4681]: I1007 17:33:44.029255 4681 scope.go:117] "RemoveContainer" containerID="a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" Oct 07 17:33:44 crc kubenswrapper[4681]: I1007 17:33:44.489042 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerStarted","Data":"ad4b0c013eb4193912c036e13ab105cfb4c4e355d6478ef69c7f9e2f52056767"} Oct 07 17:33:52 crc kubenswrapper[4681]: I1007 17:33:52.030795 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-x9lv2"] Oct 07 17:33:52 crc kubenswrapper[4681]: I1007 17:33:52.038351 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-x9lv2"] Oct 07 17:33:53 crc kubenswrapper[4681]: I1007 17:33:53.039507 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a53e8384-cd97-4cec-ae70-918f86112a99" path="/var/lib/kubelet/pods/a53e8384-cd97-4cec-ae70-918f86112a99/volumes" Oct 07 17:33:55 crc kubenswrapper[4681]: I1007 17:33:55.069618 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-afe2-account-create-vd26v"] Oct 07 17:33:55 crc kubenswrapper[4681]: I1007 17:33:55.077845 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-1c2d-account-create-qp7z8"] Oct 07 17:33:55 crc kubenswrapper[4681]: I1007 17:33:55.086226 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-1c2d-account-create-qp7z8"] Oct 07 17:33:55 crc kubenswrapper[4681]: I1007 17:33:55.094551 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-afe2-account-create-vd26v"] Oct 07 17:33:56 crc kubenswrapper[4681]: I1007 17:33:56.029924 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-03d8-account-create-8f5sj"] Oct 07 17:33:56 crc kubenswrapper[4681]: I1007 17:33:56.037112 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-03d8-account-create-8f5sj"] Oct 07 17:33:57 crc kubenswrapper[4681]: I1007 17:33:57.041279 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19fb8b33-cbe4-46dd-83b0-d35325b63940" path="/var/lib/kubelet/pods/19fb8b33-cbe4-46dd-83b0-d35325b63940/volumes" Oct 07 17:33:57 crc kubenswrapper[4681]: I1007 17:33:57.043098 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="682a62d1-b6b0-4f0d-9f94-ba1f48d92447" path="/var/lib/kubelet/pods/682a62d1-b6b0-4f0d-9f94-ba1f48d92447/volumes" Oct 07 17:33:57 crc kubenswrapper[4681]: I1007 17:33:57.044954 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f39303f5-20d2-4d09-8033-70a8c3ad916b" path="/var/lib/kubelet/pods/f39303f5-20d2-4d09-8033-70a8c3ad916b/volumes" Oct 07 17:34:06 crc kubenswrapper[4681]: I1007 17:34:06.690981 4681 generic.go:334] "Generic (PLEG): container finished" podID="41bd87d5-77d6-4866-b9b8-aaed777393b5" containerID="2d6149edb43d977a12dc92bb6c789df5abd50e224c9e4ea3793519f92922f8bd" exitCode=0 Oct 07 17:34:06 crc kubenswrapper[4681]: I1007 17:34:06.691063 4681 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7" event={"ID":"41bd87d5-77d6-4866-b9b8-aaed777393b5","Type":"ContainerDied","Data":"2d6149edb43d977a12dc92bb6c789df5abd50e224c9e4ea3793519f92922f8bd"} Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.168141 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.263097 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41bd87d5-77d6-4866-b9b8-aaed777393b5-ssh-key\") pod \"41bd87d5-77d6-4866-b9b8-aaed777393b5\" (UID: \"41bd87d5-77d6-4866-b9b8-aaed777393b5\") " Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.263259 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41bd87d5-77d6-4866-b9b8-aaed777393b5-inventory\") pod \"41bd87d5-77d6-4866-b9b8-aaed777393b5\" (UID: \"41bd87d5-77d6-4866-b9b8-aaed777393b5\") " Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.263379 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6xxm\" (UniqueName: \"kubernetes.io/projected/41bd87d5-77d6-4866-b9b8-aaed777393b5-kube-api-access-m6xxm\") pod \"41bd87d5-77d6-4866-b9b8-aaed777393b5\" (UID: \"41bd87d5-77d6-4866-b9b8-aaed777393b5\") " Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.272280 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41bd87d5-77d6-4866-b9b8-aaed777393b5-kube-api-access-m6xxm" (OuterVolumeSpecName: "kube-api-access-m6xxm") pod "41bd87d5-77d6-4866-b9b8-aaed777393b5" (UID: "41bd87d5-77d6-4866-b9b8-aaed777393b5"). InnerVolumeSpecName "kube-api-access-m6xxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.289539 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41bd87d5-77d6-4866-b9b8-aaed777393b5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "41bd87d5-77d6-4866-b9b8-aaed777393b5" (UID: "41bd87d5-77d6-4866-b9b8-aaed777393b5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.291242 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41bd87d5-77d6-4866-b9b8-aaed777393b5-inventory" (OuterVolumeSpecName: "inventory") pod "41bd87d5-77d6-4866-b9b8-aaed777393b5" (UID: "41bd87d5-77d6-4866-b9b8-aaed777393b5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.365802 4681 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/41bd87d5-77d6-4866-b9b8-aaed777393b5-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.365834 4681 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41bd87d5-77d6-4866-b9b8-aaed777393b5-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.365843 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6xxm\" (UniqueName: \"kubernetes.io/projected/41bd87d5-77d6-4866-b9b8-aaed777393b5-kube-api-access-m6xxm\") on node \"crc\" DevicePath \"\"" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.709551 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7" event={"ID":"41bd87d5-77d6-4866-b9b8-aaed777393b5","Type":"ContainerDied","Data":"a52ef6e74e7dbe516aebd9519c37c5ac3f5f179e7b444dcb271de42e698c72bb"} Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.709603 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a52ef6e74e7dbe516aebd9519c37c5ac3f5f179e7b444dcb271de42e698c72bb" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.709604 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.806941 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp"] Oct 07 17:34:08 crc kubenswrapper[4681]: E1007 17:34:08.807351 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41bd87d5-77d6-4866-b9b8-aaed777393b5" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.807370 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="41bd87d5-77d6-4866-b9b8-aaed777393b5" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 07 17:34:08 crc kubenswrapper[4681]: E1007 17:34:08.807379 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="621ea6d0-a0ae-4d6c-be2f-1b0178224e43" containerName="registry-server" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.807385 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="621ea6d0-a0ae-4d6c-be2f-1b0178224e43" containerName="registry-server" Oct 07 17:34:08 crc kubenswrapper[4681]: E1007 17:34:08.807397 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="621ea6d0-a0ae-4d6c-be2f-1b0178224e43" containerName="extract-utilities" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.807403 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="621ea6d0-a0ae-4d6c-be2f-1b0178224e43" containerName="extract-utilities" Oct 07 17:34:08 crc kubenswrapper[4681]: E1007 17:34:08.807420 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="621ea6d0-a0ae-4d6c-be2f-1b0178224e43" containerName="extract-content" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.807425 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="621ea6d0-a0ae-4d6c-be2f-1b0178224e43" containerName="extract-content" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.807581 4681 
memory_manager.go:354] "RemoveStaleState removing state" podUID="41bd87d5-77d6-4866-b9b8-aaed777393b5" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.807610 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="621ea6d0-a0ae-4d6c-be2f-1b0178224e43" containerName="registry-server" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.808273 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.810488 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.810513 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.811689 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vtl6" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.812001 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.836050 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp"] Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.876465 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2bsc\" (UniqueName: \"kubernetes.io/projected/44ed5213-33ec-47cd-bc96-8d536fa86f61-kube-api-access-g2bsc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp\" (UID: \"44ed5213-33ec-47cd-bc96-8d536fa86f61\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.876583 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44ed5213-33ec-47cd-bc96-8d536fa86f61-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp\" (UID: \"44ed5213-33ec-47cd-bc96-8d536fa86f61\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.876671 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44ed5213-33ec-47cd-bc96-8d536fa86f61-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp\" (UID: \"44ed5213-33ec-47cd-bc96-8d536fa86f61\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.978293 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2bsc\" (UniqueName: \"kubernetes.io/projected/44ed5213-33ec-47cd-bc96-8d536fa86f61-kube-api-access-g2bsc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp\" (UID: \"44ed5213-33ec-47cd-bc96-8d536fa86f61\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.978408 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/44ed5213-33ec-47cd-bc96-8d536fa86f61-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp\" (UID: \"44ed5213-33ec-47cd-bc96-8d536fa86f61\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.978470 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44ed5213-33ec-47cd-bc96-8d536fa86f61-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp\" (UID: \"44ed5213-33ec-47cd-bc96-8d536fa86f61\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.984862 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44ed5213-33ec-47cd-bc96-8d536fa86f61-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp\" (UID: \"44ed5213-33ec-47cd-bc96-8d536fa86f61\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.985024 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44ed5213-33ec-47cd-bc96-8d536fa86f61-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp\" (UID: \"44ed5213-33ec-47cd-bc96-8d536fa86f61\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp" Oct 07 17:34:08 crc kubenswrapper[4681]: I1007 17:34:08.998914 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2bsc\" (UniqueName: \"kubernetes.io/projected/44ed5213-33ec-47cd-bc96-8d536fa86f61-kube-api-access-g2bsc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp\" (UID: \"44ed5213-33ec-47cd-bc96-8d536fa86f61\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp" Oct 07 17:34:09 crc kubenswrapper[4681]: I1007 17:34:09.132098 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp" Oct 07 17:34:09 crc kubenswrapper[4681]: W1007 17:34:09.655476 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44ed5213_33ec_47cd_bc96_8d536fa86f61.slice/crio-629ffde804f46e8692d21a146a623e1d5bc8e5bcb19680c753f2c88aae4da574 WatchSource:0}: Error finding container 629ffde804f46e8692d21a146a623e1d5bc8e5bcb19680c753f2c88aae4da574: Status 404 returned error can't find the container with id 629ffde804f46e8692d21a146a623e1d5bc8e5bcb19680c753f2c88aae4da574 Oct 07 17:34:09 crc kubenswrapper[4681]: I1007 17:34:09.658854 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp"] Oct 07 17:34:09 crc kubenswrapper[4681]: I1007 17:34:09.718485 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp" event={"ID":"44ed5213-33ec-47cd-bc96-8d536fa86f61","Type":"ContainerStarted","Data":"629ffde804f46e8692d21a146a623e1d5bc8e5bcb19680c753f2c88aae4da574"} Oct 07 17:34:10 crc kubenswrapper[4681]: I1007 17:34:10.736337 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp" event={"ID":"44ed5213-33ec-47cd-bc96-8d536fa86f61","Type":"ContainerStarted","Data":"d0e77c41a152eca6e612caeb478fc9643349a8d83bfeafcee095331285327012"} Oct 07 17:34:33 crc kubenswrapper[4681]: I1007 17:34:33.047363 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp" podStartSLOduration=24.860906123 podStartE2EDuration="25.047340905s" podCreationTimestamp="2025-10-07 17:34:08 +0000 UTC" firstStartedPulling="2025-10-07 17:34:09.65795384 +0000 UTC m=+1853.305365395" lastFinishedPulling="2025-10-07 17:34:09.844388622 +0000 UTC m=+1853.491800177" observedRunningTime="2025-10-07 17:34:10.759167203 +0000 UTC m=+1854.406578798" watchObservedRunningTime="2025-10-07 17:34:33.047340905 +0000 UTC m=+1876.694752470" Oct 07 17:34:33 crc kubenswrapper[4681]: I1007 17:34:33.048644 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-djn84"] Oct 07 17:34:33 crc kubenswrapper[4681]: I1007 17:34:33.058810 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-djn84"] Oct 07 17:34:35 crc kubenswrapper[4681]: I1007 17:34:35.044397 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be2ce7e8-5280-4cfa-b2b1-d680465cd889" path="/var/lib/kubelet/pods/be2ce7e8-5280-4cfa-b2b1-d680465cd889/volumes" Oct 07 17:34:35 crc kubenswrapper[4681]: I1007 17:34:35.453545 4681 scope.go:117] "RemoveContainer" containerID="9fd06254547cd22f2f74f6da45b04403bc5efd70122ed1f1c78216e4562ec195" Oct 07 17:34:35 crc kubenswrapper[4681]: I1007 17:34:35.474912 4681 scope.go:117] "RemoveContainer" containerID="93ad7b183b558a32291106a0e4c44a05a04f97e8c0c7b38613fdc80e793cce88" Oct 07 17:34:35 crc kubenswrapper[4681]: I1007 17:34:35.523584 4681 scope.go:117] "RemoveContainer" containerID="db30f3a7f2355011a879a9893bac5559f8ade79cfc795e4262ee88633bf5eb1c" Oct 07 17:34:35 crc kubenswrapper[4681]: I1007 17:34:35.588117 4681 scope.go:117] "RemoveContainer" containerID="5ad82b80c83d78e8f5d52a2d764c63675763c4c4c5cdbf323044015855117723" Oct 07 17:34:35 crc kubenswrapper[4681]: I1007 
17:34:35.632107 4681 scope.go:117] "RemoveContainer" containerID="249da69380c06575044c5f8afa96025a6444ff4ecaf86de16d1de2e38b09aefe" Oct 07 17:34:35 crc kubenswrapper[4681]: I1007 17:34:35.671511 4681 scope.go:117] "RemoveContainer" containerID="650f29af75c677191b6adcab2341ac25de978d7148c459c417359d2622dc6b92" Oct 07 17:34:35 crc kubenswrapper[4681]: I1007 17:34:35.725658 4681 scope.go:117] "RemoveContainer" containerID="4b8c1401229f3aca69ec6a37493278b1b60a12bdf6125f4dfd244ed76351a276" Oct 07 17:34:35 crc kubenswrapper[4681]: I1007 17:34:35.749611 4681 scope.go:117] "RemoveContainer" containerID="6f5537b199e5b8bc513252240eec6cd21166d8c2bbe5e17d3d585a9fb98a93a7" Oct 07 17:35:00 crc kubenswrapper[4681]: I1007 17:35:00.035727 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bw9sz"] Oct 07 17:35:00 crc kubenswrapper[4681]: I1007 17:35:00.044559 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-kzhqh"] Oct 07 17:35:00 crc kubenswrapper[4681]: I1007 17:35:00.052004 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-kzhqh"] Oct 07 17:35:00 crc kubenswrapper[4681]: I1007 17:35:00.061434 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bw9sz"] Oct 07 17:35:01 crc kubenswrapper[4681]: I1007 17:35:01.039795 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42d37201-5b36-4972-a378-7e20139e4731" path="/var/lib/kubelet/pods/42d37201-5b36-4972-a378-7e20139e4731/volumes" Oct 07 17:35:01 crc kubenswrapper[4681]: I1007 17:35:01.040463 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbd85044-6828-4eb3-89df-b7efcd333c6a" path="/var/lib/kubelet/pods/bbd85044-6828-4eb3-89df-b7efcd333c6a/volumes" Oct 07 17:35:32 crc kubenswrapper[4681]: I1007 17:35:32.415645 4681 generic.go:334] "Generic (PLEG): container finished" podID="44ed5213-33ec-47cd-bc96-8d536fa86f61" containerID="d0e77c41a152eca6e612caeb478fc9643349a8d83bfeafcee095331285327012" exitCode=0 Oct 07 17:35:32 crc kubenswrapper[4681]: I1007 17:35:32.415715 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp" event={"ID":"44ed5213-33ec-47cd-bc96-8d536fa86f61","Type":"ContainerDied","Data":"d0e77c41a152eca6e612caeb478fc9643349a8d83bfeafcee095331285327012"} Oct 07 17:35:33 crc kubenswrapper[4681]: I1007 17:35:33.817664 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp" Oct 07 17:35:33 crc kubenswrapper[4681]: I1007 17:35:33.939821 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2bsc\" (UniqueName: \"kubernetes.io/projected/44ed5213-33ec-47cd-bc96-8d536fa86f61-kube-api-access-g2bsc\") pod \"44ed5213-33ec-47cd-bc96-8d536fa86f61\" (UID: \"44ed5213-33ec-47cd-bc96-8d536fa86f61\") " Oct 07 17:35:33 crc kubenswrapper[4681]: I1007 17:35:33.940454 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44ed5213-33ec-47cd-bc96-8d536fa86f61-ssh-key\") pod \"44ed5213-33ec-47cd-bc96-8d536fa86f61\" (UID: \"44ed5213-33ec-47cd-bc96-8d536fa86f61\") " Oct 07 17:35:33 crc kubenswrapper[4681]: I1007 17:35:33.940687 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44ed5213-33ec-47cd-bc96-8d536fa86f61-inventory\") pod \"44ed5213-33ec-47cd-bc96-8d536fa86f61\" (UID: \"44ed5213-33ec-47cd-bc96-8d536fa86f61\") " Oct 07 17:35:33 crc kubenswrapper[4681]: I1007 17:35:33.947133 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44ed5213-33ec-47cd-bc96-8d536fa86f61-kube-api-access-g2bsc" (OuterVolumeSpecName: "kube-api-access-g2bsc") pod "44ed5213-33ec-47cd-bc96-8d536fa86f61" (UID: "44ed5213-33ec-47cd-bc96-8d536fa86f61"). InnerVolumeSpecName "kube-api-access-g2bsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:35:33 crc kubenswrapper[4681]: I1007 17:35:33.971426 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44ed5213-33ec-47cd-bc96-8d536fa86f61-inventory" (OuterVolumeSpecName: "inventory") pod "44ed5213-33ec-47cd-bc96-8d536fa86f61" (UID: "44ed5213-33ec-47cd-bc96-8d536fa86f61"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:35:33 crc kubenswrapper[4681]: I1007 17:35:33.975977 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44ed5213-33ec-47cd-bc96-8d536fa86f61-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "44ed5213-33ec-47cd-bc96-8d536fa86f61" (UID: "44ed5213-33ec-47cd-bc96-8d536fa86f61"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:35:34 crc kubenswrapper[4681]: I1007 17:35:34.042788 4681 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44ed5213-33ec-47cd-bc96-8d536fa86f61-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 17:35:34 crc kubenswrapper[4681]: I1007 17:35:34.042822 4681 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44ed5213-33ec-47cd-bc96-8d536fa86f61-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 17:35:34 crc kubenswrapper[4681]: I1007 17:35:34.042834 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2bsc\" (UniqueName: \"kubernetes.io/projected/44ed5213-33ec-47cd-bc96-8d536fa86f61-kube-api-access-g2bsc\") on node \"crc\" DevicePath \"\"" Oct 07 17:35:34 crc kubenswrapper[4681]: I1007 17:35:34.436600 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp" event={"ID":"44ed5213-33ec-47cd-bc96-8d536fa86f61","Type":"ContainerDied","Data":"629ffde804f46e8692d21a146a623e1d5bc8e5bcb19680c753f2c88aae4da574"} Oct 07 17:35:34 crc kubenswrapper[4681]: I1007 17:35:34.436670 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="629ffde804f46e8692d21a146a623e1d5bc8e5bcb19680c753f2c88aae4da574" Oct 07 17:35:34 crc kubenswrapper[4681]: I1007 17:35:34.436696 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp" Oct 07 17:35:34 crc kubenswrapper[4681]: I1007 17:35:34.541720 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-49wz2"] Oct 07 17:35:34 crc kubenswrapper[4681]: E1007 17:35:34.542246 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44ed5213-33ec-47cd-bc96-8d536fa86f61" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 07 17:35:34 crc kubenswrapper[4681]: I1007 17:35:34.542269 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="44ed5213-33ec-47cd-bc96-8d536fa86f61" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 07 17:35:34 crc kubenswrapper[4681]: I1007 17:35:34.542480 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="44ed5213-33ec-47cd-bc96-8d536fa86f61" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 07 17:35:34 crc kubenswrapper[4681]: I1007 17:35:34.543168 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-49wz2" Oct 07 17:35:34 crc kubenswrapper[4681]: I1007 17:35:34.544941 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 17:35:34 crc kubenswrapper[4681]: I1007 17:35:34.545813 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vtl6" Oct 07 17:35:34 crc kubenswrapper[4681]: I1007 17:35:34.546126 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 17:35:34 crc kubenswrapper[4681]: I1007 17:35:34.550485 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-49wz2"] Oct 07 17:35:34 crc kubenswrapper[4681]: I1007 17:35:34.555482 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 17:35:34 crc kubenswrapper[4681]: I1007 17:35:34.654233 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3ee9809-6f86-44fb-9b11-163437e7750e-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-49wz2\" (UID: \"d3ee9809-6f86-44fb-9b11-163437e7750e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-49wz2" Oct 07 17:35:34 crc kubenswrapper[4681]: I1007 17:35:34.654613 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3ee9809-6f86-44fb-9b11-163437e7750e-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-49wz2\" (UID: \"d3ee9809-6f86-44fb-9b11-163437e7750e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-49wz2" Oct 07 17:35:34 crc kubenswrapper[4681]: I1007 17:35:34.654724 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrtl6\" (UniqueName: \"kubernetes.io/projected/d3ee9809-6f86-44fb-9b11-163437e7750e-kube-api-access-wrtl6\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-49wz2\" (UID: \"d3ee9809-6f86-44fb-9b11-163437e7750e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-49wz2" Oct 07 17:35:34 crc kubenswrapper[4681]: I1007 17:35:34.756569 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3ee9809-6f86-44fb-9b11-163437e7750e-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-49wz2\" (UID: \"d3ee9809-6f86-44fb-9b11-163437e7750e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-49wz2" Oct 07 17:35:34 crc kubenswrapper[4681]: I1007 17:35:34.756734 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrtl6\" (UniqueName: \"kubernetes.io/projected/d3ee9809-6f86-44fb-9b11-163437e7750e-kube-api-access-wrtl6\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-49wz2\" (UID: \"d3ee9809-6f86-44fb-9b11-163437e7750e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-49wz2" Oct 07 17:35:34 crc kubenswrapper[4681]: I1007 17:35:34.756763 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3ee9809-6f86-44fb-9b11-163437e7750e-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-49wz2\" (UID: \"d3ee9809-6f86-44fb-9b11-163437e7750e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-49wz2" Oct 07 17:35:34 crc kubenswrapper[4681]: I1007 17:35:34.760225 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3ee9809-6f86-44fb-9b11-163437e7750e-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-49wz2\" (UID: \"d3ee9809-6f86-44fb-9b11-163437e7750e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-49wz2" Oct 07 17:35:34 crc kubenswrapper[4681]: I1007 17:35:34.762641 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3ee9809-6f86-44fb-9b11-163437e7750e-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-49wz2\" (UID: \"d3ee9809-6f86-44fb-9b11-163437e7750e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-49wz2" Oct 07 17:35:34 crc kubenswrapper[4681]: I1007 17:35:34.776725 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrtl6\" (UniqueName: \"kubernetes.io/projected/d3ee9809-6f86-44fb-9b11-163437e7750e-kube-api-access-wrtl6\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-49wz2\" (UID: \"d3ee9809-6f86-44fb-9b11-163437e7750e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-49wz2" Oct 07 17:35:34 crc kubenswrapper[4681]: I1007 17:35:34.884511 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-49wz2" Oct 07 17:35:35 crc kubenswrapper[4681]: I1007 17:35:35.398835 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-49wz2"] Oct 07 17:35:35 crc kubenswrapper[4681]: W1007 17:35:35.415287 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3ee9809_6f86_44fb_9b11_163437e7750e.slice/crio-3166fe9bc8d2a8e624cb30560892b679f900b571e9f6626c34af56ae3b3d5c59 WatchSource:0}: Error finding container 3166fe9bc8d2a8e624cb30560892b679f900b571e9f6626c34af56ae3b3d5c59: Status 404 returned error can't find the container with id 3166fe9bc8d2a8e624cb30560892b679f900b571e9f6626c34af56ae3b3d5c59 Oct 07 17:35:35 crc kubenswrapper[4681]: I1007 17:35:35.446620 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-49wz2" event={"ID":"d3ee9809-6f86-44fb-9b11-163437e7750e","Type":"ContainerStarted","Data":"3166fe9bc8d2a8e624cb30560892b679f900b571e9f6626c34af56ae3b3d5c59"} Oct 07 17:35:35 crc kubenswrapper[4681]: I1007 17:35:35.927562 4681 scope.go:117] "RemoveContainer" containerID="fd4a6b3e938118ec75880ea0e489dff7d2fe5084b9dd35cd3a002f7db3f74d1c" Oct 07 17:35:35 crc kubenswrapper[4681]: I1007 17:35:35.974150 4681 scope.go:117] "RemoveContainer" containerID="473b56e8b1ae81160a230176445922e3c0bd2a0220f9cd450022e93dd99fb60b" Oct 07 17:35:36 crc kubenswrapper[4681]: I1007 17:35:36.458163 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-49wz2" event={"ID":"d3ee9809-6f86-44fb-9b11-163437e7750e","Type":"ContainerStarted","Data":"33587cad13c90ff67c49973959f612b32cfed1efe9632296311ca789bd322c1a"} Oct 07 17:35:36 crc kubenswrapper[4681]: I1007 17:35:36.480132 
4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-49wz2" podStartSLOduration=2.334603365 podStartE2EDuration="2.480113165s" podCreationTimestamp="2025-10-07 17:35:34 +0000 UTC" firstStartedPulling="2025-10-07 17:35:35.423127616 +0000 UTC m=+1939.070539171" lastFinishedPulling="2025-10-07 17:35:35.568637416 +0000 UTC m=+1939.216048971" observedRunningTime="2025-10-07 17:35:36.47815995 +0000 UTC m=+1940.125571515" watchObservedRunningTime="2025-10-07 17:35:36.480113165 +0000 UTC m=+1940.127524720" Oct 07 17:35:41 crc kubenswrapper[4681]: I1007 17:35:41.494765 4681 generic.go:334] "Generic (PLEG): container finished" podID="d3ee9809-6f86-44fb-9b11-163437e7750e" containerID="33587cad13c90ff67c49973959f612b32cfed1efe9632296311ca789bd322c1a" exitCode=0 Oct 07 17:35:41 crc kubenswrapper[4681]: I1007 17:35:41.495393 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-49wz2" event={"ID":"d3ee9809-6f86-44fb-9b11-163437e7750e","Type":"ContainerDied","Data":"33587cad13c90ff67c49973959f612b32cfed1efe9632296311ca789bd322c1a"} Oct 07 17:35:42 crc kubenswrapper[4681]: I1007 17:35:42.041594 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-fsn5z"] Oct 07 17:35:42 crc kubenswrapper[4681]: I1007 17:35:42.052256 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-fsn5z"] Oct 07 17:35:42 crc kubenswrapper[4681]: I1007 17:35:42.892099 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-49wz2" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.041197 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d57f2b-808f-4924-806d-b88ea028039b" path="/var/lib/kubelet/pods/31d57f2b-808f-4924-806d-b88ea028039b/volumes" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.041912 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3ee9809-6f86-44fb-9b11-163437e7750e-ssh-key\") pod \"d3ee9809-6f86-44fb-9b11-163437e7750e\" (UID: \"d3ee9809-6f86-44fb-9b11-163437e7750e\") " Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.042060 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrtl6\" (UniqueName: \"kubernetes.io/projected/d3ee9809-6f86-44fb-9b11-163437e7750e-kube-api-access-wrtl6\") pod \"d3ee9809-6f86-44fb-9b11-163437e7750e\" (UID: \"d3ee9809-6f86-44fb-9b11-163437e7750e\") " Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.042097 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3ee9809-6f86-44fb-9b11-163437e7750e-inventory\") pod \"d3ee9809-6f86-44fb-9b11-163437e7750e\" (UID: \"d3ee9809-6f86-44fb-9b11-163437e7750e\") " Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.051702 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3ee9809-6f86-44fb-9b11-163437e7750e-kube-api-access-wrtl6" (OuterVolumeSpecName: "kube-api-access-wrtl6") pod "d3ee9809-6f86-44fb-9b11-163437e7750e" (UID: "d3ee9809-6f86-44fb-9b11-163437e7750e"). InnerVolumeSpecName "kube-api-access-wrtl6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.069378 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3ee9809-6f86-44fb-9b11-163437e7750e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d3ee9809-6f86-44fb-9b11-163437e7750e" (UID: "d3ee9809-6f86-44fb-9b11-163437e7750e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.080638 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3ee9809-6f86-44fb-9b11-163437e7750e-inventory" (OuterVolumeSpecName: "inventory") pod "d3ee9809-6f86-44fb-9b11-163437e7750e" (UID: "d3ee9809-6f86-44fb-9b11-163437e7750e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.144642 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrtl6\" (UniqueName: \"kubernetes.io/projected/d3ee9809-6f86-44fb-9b11-163437e7750e-kube-api-access-wrtl6\") on node \"crc\" DevicePath \"\"" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.144682 4681 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3ee9809-6f86-44fb-9b11-163437e7750e-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.144695 4681 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3ee9809-6f86-44fb-9b11-163437e7750e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.510174 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-49wz2" event={"ID":"d3ee9809-6f86-44fb-9b11-163437e7750e","Type":"ContainerDied","Data":"3166fe9bc8d2a8e624cb30560892b679f900b571e9f6626c34af56ae3b3d5c59"} Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.510214 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3166fe9bc8d2a8e624cb30560892b679f900b571e9f6626c34af56ae3b3d5c59" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.510266 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-49wz2" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.581993 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-9qxmf"] Oct 07 17:35:43 crc kubenswrapper[4681]: E1007 17:35:43.582376 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ee9809-6f86-44fb-9b11-163437e7750e" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.582394 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ee9809-6f86-44fb-9b11-163437e7750e" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.582589 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3ee9809-6f86-44fb-9b11-163437e7750e" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.583272 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9qxmf" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.585559 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.587015 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.591025 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-9qxmf"] Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.593193 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.593367 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vtl6" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.754829 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/51176e78-6a59-4fe2-abc5-88a3177b9ee0-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9qxmf\" (UID: \"51176e78-6a59-4fe2-abc5-88a3177b9ee0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9qxmf" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.755153 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51176e78-6a59-4fe2-abc5-88a3177b9ee0-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9qxmf\" (UID: \"51176e78-6a59-4fe2-abc5-88a3177b9ee0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9qxmf" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.755293 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96dcr\" (UniqueName: \"kubernetes.io/projected/51176e78-6a59-4fe2-abc5-88a3177b9ee0-kube-api-access-96dcr\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9qxmf\" (UID: \"51176e78-6a59-4fe2-abc5-88a3177b9ee0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9qxmf" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.856999 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51176e78-6a59-4fe2-abc5-88a3177b9ee0-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9qxmf\" (UID: \"51176e78-6a59-4fe2-abc5-88a3177b9ee0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9qxmf" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.857076 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96dcr\" (UniqueName: \"kubernetes.io/projected/51176e78-6a59-4fe2-abc5-88a3177b9ee0-kube-api-access-96dcr\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9qxmf\" (UID: \"51176e78-6a59-4fe2-abc5-88a3177b9ee0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9qxmf" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.857183 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/51176e78-6a59-4fe2-abc5-88a3177b9ee0-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9qxmf\" (UID: 
\"51176e78-6a59-4fe2-abc5-88a3177b9ee0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9qxmf" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.864765 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/51176e78-6a59-4fe2-abc5-88a3177b9ee0-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9qxmf\" (UID: \"51176e78-6a59-4fe2-abc5-88a3177b9ee0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9qxmf" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.866193 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51176e78-6a59-4fe2-abc5-88a3177b9ee0-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9qxmf\" (UID: \"51176e78-6a59-4fe2-abc5-88a3177b9ee0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9qxmf" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.880059 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96dcr\" (UniqueName: \"kubernetes.io/projected/51176e78-6a59-4fe2-abc5-88a3177b9ee0-kube-api-access-96dcr\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-9qxmf\" (UID: \"51176e78-6a59-4fe2-abc5-88a3177b9ee0\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9qxmf" Oct 07 17:35:43 crc kubenswrapper[4681]: I1007 17:35:43.910135 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9qxmf" Oct 07 17:35:44 crc kubenswrapper[4681]: I1007 17:35:44.448826 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-9qxmf"] Oct 07 17:35:44 crc kubenswrapper[4681]: I1007 17:35:44.519111 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9qxmf" event={"ID":"51176e78-6a59-4fe2-abc5-88a3177b9ee0","Type":"ContainerStarted","Data":"530285c9389e2dbcf3aba74993251879e711aac2129e303c4cfd43d65c1ee118"} Oct 07 17:35:45 crc kubenswrapper[4681]: I1007 17:35:45.531598 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9qxmf" event={"ID":"51176e78-6a59-4fe2-abc5-88a3177b9ee0","Type":"ContainerStarted","Data":"1e00312e1616473fb97d41bf53f5cfb4b127e5dc9fd1c0e210a7104c3dde32da"} Oct 07 17:35:45 crc kubenswrapper[4681]: I1007 17:35:45.550909 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9qxmf" podStartSLOduration=2.409495819 podStartE2EDuration="2.550881894s" podCreationTimestamp="2025-10-07 17:35:43 +0000 UTC" firstStartedPulling="2025-10-07 17:35:44.45212776 +0000 UTC m=+1948.099539315" lastFinishedPulling="2025-10-07 17:35:44.593513835 +0000 UTC m=+1948.240925390" observedRunningTime="2025-10-07 17:35:45.546226114 +0000 UTC m=+1949.193637669" watchObservedRunningTime="2025-10-07 17:35:45.550881894 +0000 UTC m=+1949.198293449" Oct 07 17:36:12 crc kubenswrapper[4681]: I1007 17:36:12.195627 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 17:36:12 crc kubenswrapper[4681]: I1007 17:36:12.196196 4681 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 17:36:28 crc kubenswrapper[4681]: I1007 17:36:28.915333 4681 generic.go:334] "Generic (PLEG): container finished" podID="51176e78-6a59-4fe2-abc5-88a3177b9ee0" containerID="1e00312e1616473fb97d41bf53f5cfb4b127e5dc9fd1c0e210a7104c3dde32da" exitCode=0 Oct 07 17:36:28 crc kubenswrapper[4681]: I1007 17:36:28.915704 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9qxmf" event={"ID":"51176e78-6a59-4fe2-abc5-88a3177b9ee0","Type":"ContainerDied","Data":"1e00312e1616473fb97d41bf53f5cfb4b127e5dc9fd1c0e210a7104c3dde32da"} Oct 07 17:36:30 crc kubenswrapper[4681]: I1007 17:36:30.440249 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9qxmf" Oct 07 17:36:30 crc kubenswrapper[4681]: I1007 17:36:30.599065 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96dcr\" (UniqueName: \"kubernetes.io/projected/51176e78-6a59-4fe2-abc5-88a3177b9ee0-kube-api-access-96dcr\") pod \"51176e78-6a59-4fe2-abc5-88a3177b9ee0\" (UID: \"51176e78-6a59-4fe2-abc5-88a3177b9ee0\") " Oct 07 17:36:30 crc kubenswrapper[4681]: I1007 17:36:30.599147 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/51176e78-6a59-4fe2-abc5-88a3177b9ee0-ssh-key\") pod \"51176e78-6a59-4fe2-abc5-88a3177b9ee0\" (UID: \"51176e78-6a59-4fe2-abc5-88a3177b9ee0\") " Oct 07 17:36:30 crc kubenswrapper[4681]: I1007 17:36:30.599275 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51176e78-6a59-4fe2-abc5-88a3177b9ee0-inventory\") pod \"51176e78-6a59-4fe2-abc5-88a3177b9ee0\" (UID: \"51176e78-6a59-4fe2-abc5-88a3177b9ee0\") " Oct 07 17:36:30 crc kubenswrapper[4681]: I1007 17:36:30.605059 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51176e78-6a59-4fe2-abc5-88a3177b9ee0-kube-api-access-96dcr" (OuterVolumeSpecName: "kube-api-access-96dcr") pod "51176e78-6a59-4fe2-abc5-88a3177b9ee0" (UID: "51176e78-6a59-4fe2-abc5-88a3177b9ee0"). InnerVolumeSpecName "kube-api-access-96dcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:36:30 crc kubenswrapper[4681]: I1007 17:36:30.629261 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51176e78-6a59-4fe2-abc5-88a3177b9ee0-inventory" (OuterVolumeSpecName: "inventory") pod "51176e78-6a59-4fe2-abc5-88a3177b9ee0" (UID: "51176e78-6a59-4fe2-abc5-88a3177b9ee0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:36:30 crc kubenswrapper[4681]: I1007 17:36:30.631020 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51176e78-6a59-4fe2-abc5-88a3177b9ee0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "51176e78-6a59-4fe2-abc5-88a3177b9ee0" (UID: "51176e78-6a59-4fe2-abc5-88a3177b9ee0"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:36:30 crc kubenswrapper[4681]: I1007 17:36:30.701231 4681 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51176e78-6a59-4fe2-abc5-88a3177b9ee0-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 17:36:30 crc kubenswrapper[4681]: I1007 17:36:30.701262 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96dcr\" (UniqueName: \"kubernetes.io/projected/51176e78-6a59-4fe2-abc5-88a3177b9ee0-kube-api-access-96dcr\") on node \"crc\" DevicePath \"\"" Oct 07 17:36:30 crc kubenswrapper[4681]: I1007 17:36:30.701273 4681 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/51176e78-6a59-4fe2-abc5-88a3177b9ee0-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 17:36:30 crc kubenswrapper[4681]: I1007 17:36:30.939452 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9qxmf" event={"ID":"51176e78-6a59-4fe2-abc5-88a3177b9ee0","Type":"ContainerDied","Data":"530285c9389e2dbcf3aba74993251879e711aac2129e303c4cfd43d65c1ee118"} Oct 07 17:36:30 crc kubenswrapper[4681]: I1007 17:36:30.939490 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="530285c9389e2dbcf3aba74993251879e711aac2129e303c4cfd43d65c1ee118" Oct 07 17:36:30 crc kubenswrapper[4681]: I1007 17:36:30.939502 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-9qxmf" Oct 07 17:36:31 crc kubenswrapper[4681]: I1007 17:36:31.023697 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jktwm"] Oct 07 17:36:31 crc kubenswrapper[4681]: E1007 17:36:31.024191 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51176e78-6a59-4fe2-abc5-88a3177b9ee0" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 07 17:36:31 crc kubenswrapper[4681]: I1007 17:36:31.024215 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="51176e78-6a59-4fe2-abc5-88a3177b9ee0" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 07 17:36:31 crc kubenswrapper[4681]: I1007 17:36:31.024429 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="51176e78-6a59-4fe2-abc5-88a3177b9ee0" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 07 17:36:31 crc kubenswrapper[4681]: I1007 17:36:31.025243 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jktwm" Oct 07 17:36:31 crc kubenswrapper[4681]: I1007 17:36:31.027573 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 17:36:31 crc kubenswrapper[4681]: I1007 17:36:31.027819 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 17:36:31 crc kubenswrapper[4681]: I1007 17:36:31.027964 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 17:36:31 crc kubenswrapper[4681]: I1007 17:36:31.028086 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vtl6" Oct 07 17:36:31 crc kubenswrapper[4681]: I1007 17:36:31.041348 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jktwm"] Oct 07 17:36:31 crc kubenswrapper[4681]: I1007 17:36:31.109122 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13708146-56fd-426d-988d-d6e66d01cadb-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jktwm\" (UID: \"13708146-56fd-426d-988d-d6e66d01cadb\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jktwm" Oct 07 17:36:31 crc kubenswrapper[4681]: I1007 17:36:31.109165 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zdhh\" (UniqueName: \"kubernetes.io/projected/13708146-56fd-426d-988d-d6e66d01cadb-kube-api-access-2zdhh\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jktwm\" (UID: \"13708146-56fd-426d-988d-d6e66d01cadb\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jktwm" Oct 07 17:36:31 crc kubenswrapper[4681]: I1007 17:36:31.109237 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13708146-56fd-426d-988d-d6e66d01cadb-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jktwm\" (UID: \"13708146-56fd-426d-988d-d6e66d01cadb\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jktwm" Oct 07 17:36:31 crc kubenswrapper[4681]: I1007 17:36:31.210638 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13708146-56fd-426d-988d-d6e66d01cadb-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jktwm\" (UID: \"13708146-56fd-426d-988d-d6e66d01cadb\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jktwm" Oct 07 17:36:31 crc kubenswrapper[4681]: I1007 17:36:31.210684 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zdhh\" (UniqueName: \"kubernetes.io/projected/13708146-56fd-426d-988d-d6e66d01cadb-kube-api-access-2zdhh\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jktwm\" (UID: \"13708146-56fd-426d-988d-d6e66d01cadb\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jktwm" Oct 07 17:36:31 crc kubenswrapper[4681]: I1007 17:36:31.210797 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13708146-56fd-426d-988d-d6e66d01cadb-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jktwm\" 
(UID: \"13708146-56fd-426d-988d-d6e66d01cadb\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jktwm" Oct 07 17:36:31 crc kubenswrapper[4681]: I1007 17:36:31.214563 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13708146-56fd-426d-988d-d6e66d01cadb-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jktwm\" (UID: \"13708146-56fd-426d-988d-d6e66d01cadb\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jktwm" Oct 07 17:36:31 crc kubenswrapper[4681]: I1007 17:36:31.217379 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13708146-56fd-426d-988d-d6e66d01cadb-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jktwm\" (UID: \"13708146-56fd-426d-988d-d6e66d01cadb\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jktwm" Oct 07 17:36:31 crc kubenswrapper[4681]: I1007 17:36:31.245184 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zdhh\" (UniqueName: \"kubernetes.io/projected/13708146-56fd-426d-988d-d6e66d01cadb-kube-api-access-2zdhh\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jktwm\" (UID: \"13708146-56fd-426d-988d-d6e66d01cadb\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jktwm" Oct 07 17:36:31 crc kubenswrapper[4681]: I1007 17:36:31.352434 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jktwm" Oct 07 17:36:31 crc kubenswrapper[4681]: I1007 17:36:31.883960 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jktwm"] Oct 07 17:36:31 crc kubenswrapper[4681]: I1007 17:36:31.947318 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jktwm" event={"ID":"13708146-56fd-426d-988d-d6e66d01cadb","Type":"ContainerStarted","Data":"4ec950ce1bea876da07d44536a39d04b568c820e23e3863eccb2c765cfd38567"} Oct 07 17:36:32 crc kubenswrapper[4681]: I1007 17:36:32.956554 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jktwm" event={"ID":"13708146-56fd-426d-988d-d6e66d01cadb","Type":"ContainerStarted","Data":"fef46961b2cba761ebbc606f3f649d8076a9d76f88759d33cac2dba8f7019358"} Oct 07 17:36:32 crc kubenswrapper[4681]: I1007 17:36:32.980064 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jktwm" podStartSLOduration=1.8053881349999998 podStartE2EDuration="1.980042187s" podCreationTimestamp="2025-10-07 17:36:31 +0000 UTC" firstStartedPulling="2025-10-07 17:36:31.894671738 +0000 UTC m=+1995.542083303" lastFinishedPulling="2025-10-07 17:36:32.0693258 +0000 UTC m=+1995.716737355" observedRunningTime="2025-10-07 17:36:32.97475182 +0000 UTC m=+1996.622163385" watchObservedRunningTime="2025-10-07 17:36:32.980042187 +0000 UTC m=+1996.627453752" Oct 07 17:36:36 crc kubenswrapper[4681]: I1007 17:36:36.038191 4681 scope.go:117] "RemoveContainer" containerID="a9a5c7dbc52eee72cedd56d516f7a7fd2ba3f37ec1281ad0681a0aea2bf8bd0f" Oct 07 17:36:42 crc kubenswrapper[4681]: I1007 17:36:42.194724 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 17:36:42 crc kubenswrapper[4681]: I1007 17:36:42.195206 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 17:37:12 crc kubenswrapper[4681]: I1007 17:37:12.195409 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 17:37:12 crc kubenswrapper[4681]: I1007 17:37:12.196000 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 17:37:12 crc kubenswrapper[4681]: I1007 17:37:12.196047 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" Oct 07 17:37:12 crc kubenswrapper[4681]: I1007 17:37:12.196856 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ad4b0c013eb4193912c036e13ab105cfb4c4e355d6478ef69c7f9e2f52056767"} pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 17:37:12 crc kubenswrapper[4681]: I1007 17:37:12.196944 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" containerID="cri-o://ad4b0c013eb4193912c036e13ab105cfb4c4e355d6478ef69c7f9e2f52056767" gracePeriod=600 Oct 07 17:37:13 crc kubenswrapper[4681]: I1007 17:37:13.290654 4681 generic.go:334] "Generic (PLEG): container finished" podID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerID="ad4b0c013eb4193912c036e13ab105cfb4c4e355d6478ef69c7f9e2f52056767" exitCode=0 Oct 07 17:37:13 crc kubenswrapper[4681]: I1007 17:37:13.290895 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerDied","Data":"ad4b0c013eb4193912c036e13ab105cfb4c4e355d6478ef69c7f9e2f52056767"} Oct 07 17:37:13 crc kubenswrapper[4681]: I1007 17:37:13.290968 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerStarted","Data":"0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11"} Oct 07 17:37:13 crc kubenswrapper[4681]: I1007 17:37:13.290993 4681 scope.go:117] "RemoveContainer" containerID="a5f51e358e22f659934019078b03ff9b4d39e2e7befaec64153f9026c2ff36c0" Oct 07 17:37:32 crc kubenswrapper[4681]: I1007 17:37:32.436299 4681 generic.go:334] "Generic (PLEG): container finished" 
podID="13708146-56fd-426d-988d-d6e66d01cadb" containerID="fef46961b2cba761ebbc606f3f649d8076a9d76f88759d33cac2dba8f7019358" exitCode=2 Oct 07 17:37:32 crc kubenswrapper[4681]: I1007 17:37:32.436357 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jktwm" event={"ID":"13708146-56fd-426d-988d-d6e66d01cadb","Type":"ContainerDied","Data":"fef46961b2cba761ebbc606f3f649d8076a9d76f88759d33cac2dba8f7019358"} Oct 07 17:37:33 crc kubenswrapper[4681]: I1007 17:37:33.815142 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jktwm" Oct 07 17:37:33 crc kubenswrapper[4681]: I1007 17:37:33.941266 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13708146-56fd-426d-988d-d6e66d01cadb-ssh-key\") pod \"13708146-56fd-426d-988d-d6e66d01cadb\" (UID: \"13708146-56fd-426d-988d-d6e66d01cadb\") " Oct 07 17:37:33 crc kubenswrapper[4681]: I1007 17:37:33.941441 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zdhh\" (UniqueName: \"kubernetes.io/projected/13708146-56fd-426d-988d-d6e66d01cadb-kube-api-access-2zdhh\") pod \"13708146-56fd-426d-988d-d6e66d01cadb\" (UID: \"13708146-56fd-426d-988d-d6e66d01cadb\") " Oct 07 17:37:33 crc kubenswrapper[4681]: I1007 17:37:33.941462 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13708146-56fd-426d-988d-d6e66d01cadb-inventory\") pod \"13708146-56fd-426d-988d-d6e66d01cadb\" (UID: \"13708146-56fd-426d-988d-d6e66d01cadb\") " Oct 07 17:37:33 crc kubenswrapper[4681]: I1007 17:37:33.948223 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13708146-56fd-426d-988d-d6e66d01cadb-kube-api-access-2zdhh" (OuterVolumeSpecName: "kube-api-access-2zdhh") pod "13708146-56fd-426d-988d-d6e66d01cadb" (UID: "13708146-56fd-426d-988d-d6e66d01cadb"). InnerVolumeSpecName "kube-api-access-2zdhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:37:33 crc kubenswrapper[4681]: I1007 17:37:33.976922 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13708146-56fd-426d-988d-d6e66d01cadb-inventory" (OuterVolumeSpecName: "inventory") pod "13708146-56fd-426d-988d-d6e66d01cadb" (UID: "13708146-56fd-426d-988d-d6e66d01cadb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:37:33 crc kubenswrapper[4681]: I1007 17:37:33.978346 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13708146-56fd-426d-988d-d6e66d01cadb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "13708146-56fd-426d-988d-d6e66d01cadb" (UID: "13708146-56fd-426d-988d-d6e66d01cadb"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:37:34 crc kubenswrapper[4681]: I1007 17:37:34.056082 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zdhh\" (UniqueName: \"kubernetes.io/projected/13708146-56fd-426d-988d-d6e66d01cadb-kube-api-access-2zdhh\") on node \"crc\" DevicePath \"\"" Oct 07 17:37:34 crc kubenswrapper[4681]: I1007 17:37:34.056127 4681 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13708146-56fd-426d-988d-d6e66d01cadb-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 17:37:34 crc kubenswrapper[4681]: I1007 17:37:34.056139 4681 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13708146-56fd-426d-988d-d6e66d01cadb-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 17:37:34 crc kubenswrapper[4681]: I1007 17:37:34.454529 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jktwm" event={"ID":"13708146-56fd-426d-988d-d6e66d01cadb","Type":"ContainerDied","Data":"4ec950ce1bea876da07d44536a39d04b568c820e23e3863eccb2c765cfd38567"} Oct 07 17:37:34 crc kubenswrapper[4681]: I1007 17:37:34.454846 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ec950ce1bea876da07d44536a39d04b568c820e23e3863eccb2c765cfd38567" Oct 07 17:37:34 crc kubenswrapper[4681]: I1007 17:37:34.454664 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jktwm" Oct 07 17:37:41 crc kubenswrapper[4681]: I1007 17:37:41.048561 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d494h"] Oct 07 17:37:41 crc kubenswrapper[4681]: E1007 17:37:41.050027 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13708146-56fd-426d-988d-d6e66d01cadb" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 17:37:41 crc kubenswrapper[4681]: I1007 17:37:41.050055 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="13708146-56fd-426d-988d-d6e66d01cadb" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 17:37:41 crc kubenswrapper[4681]: I1007 17:37:41.050470 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="13708146-56fd-426d-988d-d6e66d01cadb" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 17:37:41 crc kubenswrapper[4681]: I1007 17:37:41.051696 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d494h"] Oct 07 17:37:41 crc kubenswrapper[4681]: I1007 17:37:41.051857 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d494h" Oct 07 17:37:41 crc kubenswrapper[4681]: I1007 17:37:41.069446 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vtl6" Oct 07 17:37:41 crc kubenswrapper[4681]: I1007 17:37:41.069652 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 17:37:41 crc kubenswrapper[4681]: I1007 17:37:41.069947 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 17:37:41 crc kubenswrapper[4681]: I1007 17:37:41.070571 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 17:37:41 crc kubenswrapper[4681]: I1007 17:37:41.105531 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fea47565-ef99-4b31-869a-075d2d8331e9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d494h\" (UID: \"fea47565-ef99-4b31-869a-075d2d8331e9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d494h" Oct 07 17:37:41 crc kubenswrapper[4681]: I1007 17:37:41.105647 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48vzq\" (UniqueName: \"kubernetes.io/projected/fea47565-ef99-4b31-869a-075d2d8331e9-kube-api-access-48vzq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d494h\" (UID: \"fea47565-ef99-4b31-869a-075d2d8331e9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d494h" Oct 07 17:37:41 crc kubenswrapper[4681]: I1007 17:37:41.105703 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fea47565-ef99-4b31-869a-075d2d8331e9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d494h\" (UID: \"fea47565-ef99-4b31-869a-075d2d8331e9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d494h" Oct 07 17:37:41 crc kubenswrapper[4681]: I1007 17:37:41.207533 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fea47565-ef99-4b31-869a-075d2d8331e9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d494h\" (UID: \"fea47565-ef99-4b31-869a-075d2d8331e9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d494h" Oct 07 17:37:41 crc kubenswrapper[4681]: I1007 17:37:41.207667 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48vzq\" (UniqueName: \"kubernetes.io/projected/fea47565-ef99-4b31-869a-075d2d8331e9-kube-api-access-48vzq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d494h\" (UID: \"fea47565-ef99-4b31-869a-075d2d8331e9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d494h" Oct 07 17:37:41 crc kubenswrapper[4681]: I1007 17:37:41.207740 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fea47565-ef99-4b31-869a-075d2d8331e9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d494h\" (UID: \"fea47565-ef99-4b31-869a-075d2d8331e9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d494h" Oct 07 17:37:41 crc kubenswrapper[4681]: I1007 17:37:41.216584 4681 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fea47565-ef99-4b31-869a-075d2d8331e9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d494h\" (UID: \"fea47565-ef99-4b31-869a-075d2d8331e9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d494h" Oct 07 17:37:41 crc kubenswrapper[4681]: I1007 17:37:41.227680 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fea47565-ef99-4b31-869a-075d2d8331e9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d494h\" (UID: \"fea47565-ef99-4b31-869a-075d2d8331e9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d494h" Oct 07 17:37:41 crc kubenswrapper[4681]: I1007 17:37:41.236132 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48vzq\" (UniqueName: \"kubernetes.io/projected/fea47565-ef99-4b31-869a-075d2d8331e9-kube-api-access-48vzq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-d494h\" (UID: \"fea47565-ef99-4b31-869a-075d2d8331e9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d494h" Oct 07 17:37:41 crc kubenswrapper[4681]: I1007 17:37:41.386122 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d494h" Oct 07 17:37:41 crc kubenswrapper[4681]: I1007 17:37:41.942092 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d494h"] Oct 07 17:37:41 crc kubenswrapper[4681]: I1007 17:37:41.959363 4681 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 17:37:42 crc kubenswrapper[4681]: I1007 17:37:42.522701 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d494h" event={"ID":"fea47565-ef99-4b31-869a-075d2d8331e9","Type":"ContainerStarted","Data":"9504377313c1ff3912c06536fb93855a43f4c622f3a880dd2a189b249031b9d3"} Oct 07 17:37:42 crc kubenswrapper[4681]: I1007 17:37:42.523035 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d494h" event={"ID":"fea47565-ef99-4b31-869a-075d2d8331e9","Type":"ContainerStarted","Data":"6035345f52bb75d293a74eca184e87e3e2ad025bcc2fa6d35254c19f9c4ef1fa"} Oct 07 17:37:42 crc kubenswrapper[4681]: I1007 17:37:42.543245 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d494h" podStartSLOduration=1.398923079 podStartE2EDuration="1.543228269s" podCreationTimestamp="2025-10-07 17:37:41 +0000 UTC" firstStartedPulling="2025-10-07 17:37:41.959122079 +0000 UTC m=+2065.606533634" lastFinishedPulling="2025-10-07 17:37:42.103427269 +0000 UTC m=+2065.750838824" observedRunningTime="2025-10-07 17:37:42.537387497 +0000 UTC m=+2066.184799052" watchObservedRunningTime="2025-10-07 17:37:42.543228269 +0000 UTC m=+2066.190639824" Oct 07 17:37:50 crc kubenswrapper[4681]: I1007 17:37:50.993282 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hw2fz"] Oct 07 17:37:50 crc kubenswrapper[4681]: I1007 17:37:50.995570 4681 util.go:30] "No sandbox for pod can be found. 
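The pod_startup_latency_tracker entry above encodes two durations: podStartE2EDuration (pod creation to observed running) and podStartSLOduration, which in this entry differs from it by exactly the image-pull window (lastFinishedPulling minus firstStartedPulling). A quick check of that arithmetic using the monotonic clock offsets (m=+…) copied from the entry; values are exact up to float64 rounding:

```go
package main

import "fmt"

func main() {
	// Monotonic offsets (seconds) from the startup-latency entry above.
	firstStartedPulling := 2065.606533634
	lastFinishedPulling := 2065.750838824
	e2e := 1.543228269 // podStartE2EDuration: observedRunningTime - podCreationTimestamp

	pull := lastFinishedPulling - firstStartedPulling // time spent pulling the image
	slo := e2e - pull                                 // startup duration excluding the pull

	fmt.Printf("pull=%.9fs slo=%.9fs\n", pull, slo)
	// Prints pull=0.144305190s slo=1.398923079s, matching podStartSLOduration.
}
```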
Need to start a new one" pod="openshift-marketplace/community-operators-hw2fz" Oct 07 17:37:51 crc kubenswrapper[4681]: I1007 17:37:51.011924 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hw2fz"] Oct 07 17:37:51 crc kubenswrapper[4681]: I1007 17:37:51.088541 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d8fbe23-c90b-4866-a733-a86051caf14b-catalog-content\") pod \"community-operators-hw2fz\" (UID: \"4d8fbe23-c90b-4866-a733-a86051caf14b\") " pod="openshift-marketplace/community-operators-hw2fz" Oct 07 17:37:51 crc kubenswrapper[4681]: I1007 17:37:51.088620 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d8fbe23-c90b-4866-a733-a86051caf14b-utilities\") pod \"community-operators-hw2fz\" (UID: \"4d8fbe23-c90b-4866-a733-a86051caf14b\") " pod="openshift-marketplace/community-operators-hw2fz" Oct 07 17:37:51 crc kubenswrapper[4681]: I1007 17:37:51.088650 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjgpd\" (UniqueName: \"kubernetes.io/projected/4d8fbe23-c90b-4866-a733-a86051caf14b-kube-api-access-vjgpd\") pod \"community-operators-hw2fz\" (UID: \"4d8fbe23-c90b-4866-a733-a86051caf14b\") " pod="openshift-marketplace/community-operators-hw2fz" Oct 07 17:37:51 crc kubenswrapper[4681]: I1007 17:37:51.190672 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d8fbe23-c90b-4866-a733-a86051caf14b-utilities\") pod \"community-operators-hw2fz\" (UID: \"4d8fbe23-c90b-4866-a733-a86051caf14b\") " pod="openshift-marketplace/community-operators-hw2fz" Oct 07 17:37:51 crc kubenswrapper[4681]: I1007 17:37:51.190767 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjgpd\" (UniqueName: \"kubernetes.io/projected/4d8fbe23-c90b-4866-a733-a86051caf14b-kube-api-access-vjgpd\") pod \"community-operators-hw2fz\" (UID: \"4d8fbe23-c90b-4866-a733-a86051caf14b\") " pod="openshift-marketplace/community-operators-hw2fz" Oct 07 17:37:51 crc kubenswrapper[4681]: I1007 17:37:51.191010 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d8fbe23-c90b-4866-a733-a86051caf14b-catalog-content\") pod \"community-operators-hw2fz\" (UID: \"4d8fbe23-c90b-4866-a733-a86051caf14b\") " pod="openshift-marketplace/community-operators-hw2fz" Oct 07 17:37:51 crc kubenswrapper[4681]: I1007 17:37:51.191304 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d8fbe23-c90b-4866-a733-a86051caf14b-utilities\") pod \"community-operators-hw2fz\" (UID: \"4d8fbe23-c90b-4866-a733-a86051caf14b\") " pod="openshift-marketplace/community-operators-hw2fz" Oct 07 17:37:51 crc kubenswrapper[4681]: I1007 17:37:51.192160 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d8fbe23-c90b-4866-a733-a86051caf14b-catalog-content\") pod \"community-operators-hw2fz\" (UID: \"4d8fbe23-c90b-4866-a733-a86051caf14b\") " pod="openshift-marketplace/community-operators-hw2fz" Oct 07 17:37:51 crc kubenswrapper[4681]: I1007 17:37:51.215692 4681 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vjgpd\" (UniqueName: \"kubernetes.io/projected/4d8fbe23-c90b-4866-a733-a86051caf14b-kube-api-access-vjgpd\") pod \"community-operators-hw2fz\" (UID: \"4d8fbe23-c90b-4866-a733-a86051caf14b\") " pod="openshift-marketplace/community-operators-hw2fz" Oct 07 17:37:51 crc kubenswrapper[4681]: I1007 17:37:51.318735 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hw2fz" Oct 07 17:37:51 crc kubenswrapper[4681]: I1007 17:37:51.652150 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gd8nz"] Oct 07 17:37:51 crc kubenswrapper[4681]: I1007 17:37:51.697449 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gd8nz" Oct 07 17:37:51 crc kubenswrapper[4681]: I1007 17:37:51.724820 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gd8nz"] Oct 07 17:37:51 crc kubenswrapper[4681]: I1007 17:37:51.812065 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbce2cd-bee0-4f15-8e80-534047ddd94e-catalog-content\") pod \"certified-operators-gd8nz\" (UID: \"adbce2cd-bee0-4f15-8e80-534047ddd94e\") " pod="openshift-marketplace/certified-operators-gd8nz" Oct 07 17:37:51 crc kubenswrapper[4681]: I1007 17:37:51.812306 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln5vf\" (UniqueName: \"kubernetes.io/projected/adbce2cd-bee0-4f15-8e80-534047ddd94e-kube-api-access-ln5vf\") pod \"certified-operators-gd8nz\" (UID: \"adbce2cd-bee0-4f15-8e80-534047ddd94e\") " pod="openshift-marketplace/certified-operators-gd8nz" Oct 07 17:37:51 crc kubenswrapper[4681]: I1007 17:37:51.812407 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbce2cd-bee0-4f15-8e80-534047ddd94e-utilities\") pod \"certified-operators-gd8nz\" (UID: \"adbce2cd-bee0-4f15-8e80-534047ddd94e\") " pod="openshift-marketplace/certified-operators-gd8nz" Oct 07 17:37:51 crc kubenswrapper[4681]: I1007 17:37:51.913915 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbce2cd-bee0-4f15-8e80-534047ddd94e-catalog-content\") pod \"certified-operators-gd8nz\" (UID: \"adbce2cd-bee0-4f15-8e80-534047ddd94e\") " pod="openshift-marketplace/certified-operators-gd8nz" Oct 07 17:37:51 crc kubenswrapper[4681]: I1007 17:37:51.914026 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln5vf\" (UniqueName: \"kubernetes.io/projected/adbce2cd-bee0-4f15-8e80-534047ddd94e-kube-api-access-ln5vf\") pod \"certified-operators-gd8nz\" (UID: \"adbce2cd-bee0-4f15-8e80-534047ddd94e\") " pod="openshift-marketplace/certified-operators-gd8nz" Oct 07 17:37:51 crc kubenswrapper[4681]: I1007 17:37:51.914097 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbce2cd-bee0-4f15-8e80-534047ddd94e-utilities\") pod \"certified-operators-gd8nz\" (UID: \"adbce2cd-bee0-4f15-8e80-534047ddd94e\") " pod="openshift-marketplace/certified-operators-gd8nz" Oct 07 17:37:51 crc kubenswrapper[4681]: I1007 17:37:51.914495 4681 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbce2cd-bee0-4f15-8e80-534047ddd94e-utilities\") pod \"certified-operators-gd8nz\" (UID: \"adbce2cd-bee0-4f15-8e80-534047ddd94e\") " pod="openshift-marketplace/certified-operators-gd8nz" Oct 07 17:37:51 crc kubenswrapper[4681]: I1007 17:37:51.914702 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbce2cd-bee0-4f15-8e80-534047ddd94e-catalog-content\") pod \"certified-operators-gd8nz\" (UID: \"adbce2cd-bee0-4f15-8e80-534047ddd94e\") " pod="openshift-marketplace/certified-operators-gd8nz" Oct 07 17:37:51 crc kubenswrapper[4681]: I1007 17:37:51.943511 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln5vf\" (UniqueName: \"kubernetes.io/projected/adbce2cd-bee0-4f15-8e80-534047ddd94e-kube-api-access-ln5vf\") pod \"certified-operators-gd8nz\" (UID: \"adbce2cd-bee0-4f15-8e80-534047ddd94e\") " pod="openshift-marketplace/certified-operators-gd8nz" Oct 07 17:37:52 crc kubenswrapper[4681]: I1007 17:37:52.045068 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gd8nz" Oct 07 17:37:52 crc kubenswrapper[4681]: I1007 17:37:52.047187 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hw2fz"] Oct 07 17:37:52 crc kubenswrapper[4681]: I1007 17:37:52.568932 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gd8nz"] Oct 07 17:37:52 crc kubenswrapper[4681]: I1007 17:37:52.705213 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gd8nz" event={"ID":"adbce2cd-bee0-4f15-8e80-534047ddd94e","Type":"ContainerStarted","Data":"c1a7d364988edec8b4881e596bdf87cf94bb3002fb78d53a0a2911137ecd7e7f"} Oct 07 17:37:52 crc kubenswrapper[4681]: I1007 17:37:52.710375 4681 generic.go:334] "Generic (PLEG): container finished" podID="4d8fbe23-c90b-4866-a733-a86051caf14b" containerID="a89bad5f70e261b36e75d6c20dff449087beb76a1b5934165902c9de99a692b1" exitCode=0 Oct 07 17:37:52 crc kubenswrapper[4681]: I1007 17:37:52.710441 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hw2fz" event={"ID":"4d8fbe23-c90b-4866-a733-a86051caf14b","Type":"ContainerDied","Data":"a89bad5f70e261b36e75d6c20dff449087beb76a1b5934165902c9de99a692b1"} Oct 07 17:37:52 crc kubenswrapper[4681]: I1007 17:37:52.710491 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hw2fz" event={"ID":"4d8fbe23-c90b-4866-a733-a86051caf14b","Type":"ContainerStarted","Data":"6e58019f697e0630270168b43ac961834f8df7e1d18d55450657ebe648c8f2f8"} Oct 07 17:37:53 crc kubenswrapper[4681]: I1007 17:37:53.723621 4681 generic.go:334] "Generic (PLEG): container finished" podID="adbce2cd-bee0-4f15-8e80-534047ddd94e" containerID="acca44d980f48782dd65b2f53ad74761dd18c29171c88d673c3a8c4c0f773623" exitCode=0 Oct 07 17:37:53 crc kubenswrapper[4681]: I1007 17:37:53.723747 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gd8nz" event={"ID":"adbce2cd-bee0-4f15-8e80-534047ddd94e","Type":"ContainerDied","Data":"acca44d980f48782dd65b2f53ad74761dd18c29171c88d673c3a8c4c0f773623"} Oct 07 17:37:53 crc kubenswrapper[4681]: I1007 17:37:53.727502 4681 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-hw2fz" event={"ID":"4d8fbe23-c90b-4866-a733-a86051caf14b","Type":"ContainerStarted","Data":"1f3595f2d1d50dac8ea5465cb21ba906c708c2c9498fdf03d38c60e1b4d04a0b"} Oct 07 17:37:55 crc kubenswrapper[4681]: I1007 17:37:55.746140 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gd8nz" event={"ID":"adbce2cd-bee0-4f15-8e80-534047ddd94e","Type":"ContainerStarted","Data":"bb3ed2af03a5e807b9a83210666a4adac55d6247a94b86111adcc8161b8035f6"} Oct 07 17:37:55 crc kubenswrapper[4681]: I1007 17:37:55.748330 4681 generic.go:334] "Generic (PLEG): container finished" podID="4d8fbe23-c90b-4866-a733-a86051caf14b" containerID="1f3595f2d1d50dac8ea5465cb21ba906c708c2c9498fdf03d38c60e1b4d04a0b" exitCode=0 Oct 07 17:37:55 crc kubenswrapper[4681]: I1007 17:37:55.748386 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hw2fz" event={"ID":"4d8fbe23-c90b-4866-a733-a86051caf14b","Type":"ContainerDied","Data":"1f3595f2d1d50dac8ea5465cb21ba906c708c2c9498fdf03d38c60e1b4d04a0b"} Oct 07 17:37:57 crc kubenswrapper[4681]: I1007 17:37:57.771338 4681 generic.go:334] "Generic (PLEG): container finished" podID="adbce2cd-bee0-4f15-8e80-534047ddd94e" containerID="bb3ed2af03a5e807b9a83210666a4adac55d6247a94b86111adcc8161b8035f6" exitCode=0 Oct 07 17:37:57 crc kubenswrapper[4681]: I1007 17:37:57.771371 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gd8nz" event={"ID":"adbce2cd-bee0-4f15-8e80-534047ddd94e","Type":"ContainerDied","Data":"bb3ed2af03a5e807b9a83210666a4adac55d6247a94b86111adcc8161b8035f6"} Oct 07 17:37:57 crc kubenswrapper[4681]: I1007 17:37:57.775782 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hw2fz" event={"ID":"4d8fbe23-c90b-4866-a733-a86051caf14b","Type":"ContainerStarted","Data":"a7a62c42e1c170a1cc93074ce623c92115076ec58346d3e4c9e91392bdbc7e85"} Oct 07 17:37:57 crc kubenswrapper[4681]: I1007 17:37:57.839436 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hw2fz" podStartSLOduration=3.6819953930000002 podStartE2EDuration="7.839418069s" podCreationTimestamp="2025-10-07 17:37:50 +0000 UTC" firstStartedPulling="2025-10-07 17:37:52.713106212 +0000 UTC m=+2076.360517757" lastFinishedPulling="2025-10-07 17:37:56.870528878 +0000 UTC m=+2080.517940433" observedRunningTime="2025-10-07 17:37:57.835157891 +0000 UTC m=+2081.482569446" watchObservedRunningTime="2025-10-07 17:37:57.839418069 +0000 UTC m=+2081.486829624" Oct 07 17:37:58 crc kubenswrapper[4681]: I1007 17:37:58.784194 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gd8nz" event={"ID":"adbce2cd-bee0-4f15-8e80-534047ddd94e","Type":"ContainerStarted","Data":"044230421ed0b97a1d2799f171d72696541a0bc5b40957025033252b4985a211"} Oct 07 17:37:58 crc kubenswrapper[4681]: I1007 17:37:58.814024 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gd8nz" podStartSLOduration=3.36855125 podStartE2EDuration="7.813998118s" podCreationTimestamp="2025-10-07 17:37:51 +0000 UTC" firstStartedPulling="2025-10-07 17:37:53.725915574 +0000 UTC m=+2077.373327149" lastFinishedPulling="2025-10-07 17:37:58.171362462 +0000 UTC m=+2081.818774017" observedRunningTime="2025-10-07 17:37:58.801205393 +0000 UTC m=+2082.448616958" 
watchObservedRunningTime="2025-10-07 17:37:58.813998118 +0000 UTC m=+2082.461409683" Oct 07 17:38:01 crc kubenswrapper[4681]: I1007 17:38:01.319595 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hw2fz" Oct 07 17:38:01 crc kubenswrapper[4681]: I1007 17:38:01.319940 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hw2fz" Oct 07 17:38:02 crc kubenswrapper[4681]: I1007 17:38:02.045934 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gd8nz" Oct 07 17:38:02 crc kubenswrapper[4681]: I1007 17:38:02.045975 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gd8nz" Oct 07 17:38:02 crc kubenswrapper[4681]: I1007 17:38:02.363501 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-hw2fz" podUID="4d8fbe23-c90b-4866-a733-a86051caf14b" containerName="registry-server" probeResult="failure" output=< Oct 07 17:38:02 crc kubenswrapper[4681]: timeout: failed to connect service ":50051" within 1s Oct 07 17:38:02 crc kubenswrapper[4681]: > Oct 07 17:38:03 crc kubenswrapper[4681]: I1007 17:38:03.089349 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-gd8nz" podUID="adbce2cd-bee0-4f15-8e80-534047ddd94e" containerName="registry-server" probeResult="failure" output=< Oct 07 17:38:03 crc kubenswrapper[4681]: timeout: failed to connect service ":50051" within 1s Oct 07 17:38:03 crc kubenswrapper[4681]: > Oct 07 17:38:11 crc kubenswrapper[4681]: I1007 17:38:11.380481 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hw2fz" Oct 07 17:38:11 crc kubenswrapper[4681]: I1007 17:38:11.434646 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hw2fz" Oct 07 17:38:11 crc kubenswrapper[4681]: I1007 17:38:11.613924 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hw2fz"] Oct 07 17:38:12 crc kubenswrapper[4681]: I1007 17:38:12.088610 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gd8nz" Oct 07 17:38:12 crc kubenswrapper[4681]: I1007 17:38:12.135597 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gd8nz" Oct 07 17:38:12 crc kubenswrapper[4681]: I1007 17:38:12.913305 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hw2fz" podUID="4d8fbe23-c90b-4866-a733-a86051caf14b" containerName="registry-server" containerID="cri-o://a7a62c42e1c170a1cc93074ce623c92115076ec58346d3e4c9e91392bdbc7e85" gracePeriod=2 Oct 07 17:38:13 crc kubenswrapper[4681]: I1007 17:38:13.381014 4681 util.go:48] "No ready sandbox for pod can be found. 
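The startup-probe failures above carry the output `timeout: failed to connect service ":50051" within 1s`, which matches the error format of grpc_health_probe; the registry-server container serves gRPC on :50051 and simply is not listening yet, and roughly ten seconds later both the startup and readiness probes flip to healthy. A sketch of an exec startup probe consistent with that output; the exact command and thresholds of the marketplace catalog pods are not visible in the log, so treat them as assumptions:

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	// Hypothetical reconstruction of the registry-server startup probe.
	startup := &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			Exec: &corev1.ExecAction{
				Command: []string{"grpc_health_probe", "-addr=:50051"},
			},
		},
		TimeoutSeconds:   1,  // matches the "within 1s" in the probe output
		PeriodSeconds:    10, // illustrative; consistent with the ~10s retry spacing above
		FailureThreshold: 15,
	}
	fmt.Println("startup probe:", startup.Exec.Command)
}
```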
Need to start a new one" pod="openshift-marketplace/community-operators-hw2fz" Oct 07 17:38:13 crc kubenswrapper[4681]: I1007 17:38:13.539120 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d8fbe23-c90b-4866-a733-a86051caf14b-catalog-content\") pod \"4d8fbe23-c90b-4866-a733-a86051caf14b\" (UID: \"4d8fbe23-c90b-4866-a733-a86051caf14b\") " Oct 07 17:38:13 crc kubenswrapper[4681]: I1007 17:38:13.539302 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d8fbe23-c90b-4866-a733-a86051caf14b-utilities\") pod \"4d8fbe23-c90b-4866-a733-a86051caf14b\" (UID: \"4d8fbe23-c90b-4866-a733-a86051caf14b\") " Oct 07 17:38:13 crc kubenswrapper[4681]: I1007 17:38:13.539396 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjgpd\" (UniqueName: \"kubernetes.io/projected/4d8fbe23-c90b-4866-a733-a86051caf14b-kube-api-access-vjgpd\") pod \"4d8fbe23-c90b-4866-a733-a86051caf14b\" (UID: \"4d8fbe23-c90b-4866-a733-a86051caf14b\") " Oct 07 17:38:13 crc kubenswrapper[4681]: I1007 17:38:13.540008 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d8fbe23-c90b-4866-a733-a86051caf14b-utilities" (OuterVolumeSpecName: "utilities") pod "4d8fbe23-c90b-4866-a733-a86051caf14b" (UID: "4d8fbe23-c90b-4866-a733-a86051caf14b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:38:13 crc kubenswrapper[4681]: I1007 17:38:13.553228 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d8fbe23-c90b-4866-a733-a86051caf14b-kube-api-access-vjgpd" (OuterVolumeSpecName: "kube-api-access-vjgpd") pod "4d8fbe23-c90b-4866-a733-a86051caf14b" (UID: "4d8fbe23-c90b-4866-a733-a86051caf14b"). InnerVolumeSpecName "kube-api-access-vjgpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:38:13 crc kubenswrapper[4681]: I1007 17:38:13.582353 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d8fbe23-c90b-4866-a733-a86051caf14b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d8fbe23-c90b-4866-a733-a86051caf14b" (UID: "4d8fbe23-c90b-4866-a733-a86051caf14b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:38:13 crc kubenswrapper[4681]: I1007 17:38:13.641979 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjgpd\" (UniqueName: \"kubernetes.io/projected/4d8fbe23-c90b-4866-a733-a86051caf14b-kube-api-access-vjgpd\") on node \"crc\" DevicePath \"\"" Oct 07 17:38:13 crc kubenswrapper[4681]: I1007 17:38:13.642268 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d8fbe23-c90b-4866-a733-a86051caf14b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 17:38:13 crc kubenswrapper[4681]: I1007 17:38:13.642348 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d8fbe23-c90b-4866-a733-a86051caf14b-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 17:38:13 crc kubenswrapper[4681]: I1007 17:38:13.937243 4681 generic.go:334] "Generic (PLEG): container finished" podID="4d8fbe23-c90b-4866-a733-a86051caf14b" containerID="a7a62c42e1c170a1cc93074ce623c92115076ec58346d3e4c9e91392bdbc7e85" exitCode=0 Oct 07 17:38:13 crc kubenswrapper[4681]: I1007 17:38:13.937443 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hw2fz" event={"ID":"4d8fbe23-c90b-4866-a733-a86051caf14b","Type":"ContainerDied","Data":"a7a62c42e1c170a1cc93074ce623c92115076ec58346d3e4c9e91392bdbc7e85"} Oct 07 17:38:13 crc kubenswrapper[4681]: I1007 17:38:13.938177 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hw2fz" event={"ID":"4d8fbe23-c90b-4866-a733-a86051caf14b","Type":"ContainerDied","Data":"6e58019f697e0630270168b43ac961834f8df7e1d18d55450657ebe648c8f2f8"} Oct 07 17:38:13 crc kubenswrapper[4681]: I1007 17:38:13.938274 4681 scope.go:117] "RemoveContainer" containerID="a7a62c42e1c170a1cc93074ce623c92115076ec58346d3e4c9e91392bdbc7e85" Oct 07 17:38:13 crc kubenswrapper[4681]: I1007 17:38:13.937472 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hw2fz" Oct 07 17:38:13 crc kubenswrapper[4681]: I1007 17:38:13.986936 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hw2fz"] Oct 07 17:38:13 crc kubenswrapper[4681]: I1007 17:38:13.987954 4681 scope.go:117] "RemoveContainer" containerID="1f3595f2d1d50dac8ea5465cb21ba906c708c2c9498fdf03d38c60e1b4d04a0b" Oct 07 17:38:13 crc kubenswrapper[4681]: I1007 17:38:13.994278 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hw2fz"] Oct 07 17:38:14 crc kubenswrapper[4681]: I1007 17:38:14.023091 4681 scope.go:117] "RemoveContainer" containerID="a89bad5f70e261b36e75d6c20dff449087beb76a1b5934165902c9de99a692b1" Oct 07 17:38:14 crc kubenswrapper[4681]: I1007 17:38:14.071398 4681 scope.go:117] "RemoveContainer" containerID="a7a62c42e1c170a1cc93074ce623c92115076ec58346d3e4c9e91392bdbc7e85" Oct 07 17:38:14 crc kubenswrapper[4681]: E1007 17:38:14.076057 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7a62c42e1c170a1cc93074ce623c92115076ec58346d3e4c9e91392bdbc7e85\": container with ID starting with a7a62c42e1c170a1cc93074ce623c92115076ec58346d3e4c9e91392bdbc7e85 not found: ID does not exist" containerID="a7a62c42e1c170a1cc93074ce623c92115076ec58346d3e4c9e91392bdbc7e85" Oct 07 17:38:14 crc kubenswrapper[4681]: I1007 17:38:14.076116 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7a62c42e1c170a1cc93074ce623c92115076ec58346d3e4c9e91392bdbc7e85"} err="failed to get container status \"a7a62c42e1c170a1cc93074ce623c92115076ec58346d3e4c9e91392bdbc7e85\": rpc error: code = NotFound desc = could not find container \"a7a62c42e1c170a1cc93074ce623c92115076ec58346d3e4c9e91392bdbc7e85\": container with ID starting with a7a62c42e1c170a1cc93074ce623c92115076ec58346d3e4c9e91392bdbc7e85 not found: ID does not exist" Oct 07 17:38:14 crc kubenswrapper[4681]: I1007 17:38:14.076143 4681 scope.go:117] "RemoveContainer" containerID="1f3595f2d1d50dac8ea5465cb21ba906c708c2c9498fdf03d38c60e1b4d04a0b" Oct 07 17:38:14 crc kubenswrapper[4681]: E1007 17:38:14.076683 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f3595f2d1d50dac8ea5465cb21ba906c708c2c9498fdf03d38c60e1b4d04a0b\": container with ID starting with 1f3595f2d1d50dac8ea5465cb21ba906c708c2c9498fdf03d38c60e1b4d04a0b not found: ID does not exist" containerID="1f3595f2d1d50dac8ea5465cb21ba906c708c2c9498fdf03d38c60e1b4d04a0b" Oct 07 17:38:14 crc kubenswrapper[4681]: I1007 17:38:14.076713 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f3595f2d1d50dac8ea5465cb21ba906c708c2c9498fdf03d38c60e1b4d04a0b"} err="failed to get container status \"1f3595f2d1d50dac8ea5465cb21ba906c708c2c9498fdf03d38c60e1b4d04a0b\": rpc error: code = NotFound desc = could not find container \"1f3595f2d1d50dac8ea5465cb21ba906c708c2c9498fdf03d38c60e1b4d04a0b\": container with ID starting with 1f3595f2d1d50dac8ea5465cb21ba906c708c2c9498fdf03d38c60e1b4d04a0b not found: ID does not exist" Oct 07 17:38:14 crc kubenswrapper[4681]: I1007 17:38:14.076732 4681 scope.go:117] "RemoveContainer" containerID="a89bad5f70e261b36e75d6c20dff449087beb76a1b5934165902c9de99a692b1" Oct 07 17:38:14 crc kubenswrapper[4681]: E1007 17:38:14.082054 4681 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a89bad5f70e261b36e75d6c20dff449087beb76a1b5934165902c9de99a692b1\": container with ID starting with a89bad5f70e261b36e75d6c20dff449087beb76a1b5934165902c9de99a692b1 not found: ID does not exist" containerID="a89bad5f70e261b36e75d6c20dff449087beb76a1b5934165902c9de99a692b1" Oct 07 17:38:14 crc kubenswrapper[4681]: I1007 17:38:14.082113 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a89bad5f70e261b36e75d6c20dff449087beb76a1b5934165902c9de99a692b1"} err="failed to get container status \"a89bad5f70e261b36e75d6c20dff449087beb76a1b5934165902c9de99a692b1\": rpc error: code = NotFound desc = could not find container \"a89bad5f70e261b36e75d6c20dff449087beb76a1b5934165902c9de99a692b1\": container with ID starting with a89bad5f70e261b36e75d6c20dff449087beb76a1b5934165902c9de99a692b1 not found: ID does not exist" Oct 07 17:38:14 crc kubenswrapper[4681]: I1007 17:38:14.417335 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gd8nz"] Oct 07 17:38:14 crc kubenswrapper[4681]: I1007 17:38:14.417993 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gd8nz" podUID="adbce2cd-bee0-4f15-8e80-534047ddd94e" containerName="registry-server" containerID="cri-o://044230421ed0b97a1d2799f171d72696541a0bc5b40957025033252b4985a211" gracePeriod=2 Oct 07 17:38:14 crc kubenswrapper[4681]: I1007 17:38:14.895096 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gd8nz" Oct 07 17:38:14 crc kubenswrapper[4681]: I1007 17:38:14.961734 4681 generic.go:334] "Generic (PLEG): container finished" podID="adbce2cd-bee0-4f15-8e80-534047ddd94e" containerID="044230421ed0b97a1d2799f171d72696541a0bc5b40957025033252b4985a211" exitCode=0 Oct 07 17:38:14 crc kubenswrapper[4681]: I1007 17:38:14.961771 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gd8nz" event={"ID":"adbce2cd-bee0-4f15-8e80-534047ddd94e","Type":"ContainerDied","Data":"044230421ed0b97a1d2799f171d72696541a0bc5b40957025033252b4985a211"} Oct 07 17:38:14 crc kubenswrapper[4681]: I1007 17:38:14.961794 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gd8nz" event={"ID":"adbce2cd-bee0-4f15-8e80-534047ddd94e","Type":"ContainerDied","Data":"c1a7d364988edec8b4881e596bdf87cf94bb3002fb78d53a0a2911137ecd7e7f"} Oct 07 17:38:14 crc kubenswrapper[4681]: I1007 17:38:14.961810 4681 scope.go:117] "RemoveContainer" containerID="044230421ed0b97a1d2799f171d72696541a0bc5b40957025033252b4985a211" Oct 07 17:38:14 crc kubenswrapper[4681]: I1007 17:38:14.961920 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gd8nz" Oct 07 17:38:14 crc kubenswrapper[4681]: I1007 17:38:14.969419 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln5vf\" (UniqueName: \"kubernetes.io/projected/adbce2cd-bee0-4f15-8e80-534047ddd94e-kube-api-access-ln5vf\") pod \"adbce2cd-bee0-4f15-8e80-534047ddd94e\" (UID: \"adbce2cd-bee0-4f15-8e80-534047ddd94e\") " Oct 07 17:38:14 crc kubenswrapper[4681]: I1007 17:38:14.979227 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adbce2cd-bee0-4f15-8e80-534047ddd94e-kube-api-access-ln5vf" (OuterVolumeSpecName: "kube-api-access-ln5vf") pod "adbce2cd-bee0-4f15-8e80-534047ddd94e" (UID: "adbce2cd-bee0-4f15-8e80-534047ddd94e"). InnerVolumeSpecName "kube-api-access-ln5vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:38:15 crc kubenswrapper[4681]: I1007 17:38:15.005529 4681 scope.go:117] "RemoveContainer" containerID="bb3ed2af03a5e807b9a83210666a4adac55d6247a94b86111adcc8161b8035f6" Oct 07 17:38:15 crc kubenswrapper[4681]: I1007 17:38:15.034718 4681 scope.go:117] "RemoveContainer" containerID="acca44d980f48782dd65b2f53ad74761dd18c29171c88d673c3a8c4c0f773623" Oct 07 17:38:15 crc kubenswrapper[4681]: I1007 17:38:15.039818 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d8fbe23-c90b-4866-a733-a86051caf14b" path="/var/lib/kubelet/pods/4d8fbe23-c90b-4866-a733-a86051caf14b/volumes" Oct 07 17:38:15 crc kubenswrapper[4681]: I1007 17:38:15.051637 4681 scope.go:117] "RemoveContainer" containerID="044230421ed0b97a1d2799f171d72696541a0bc5b40957025033252b4985a211" Oct 07 17:38:15 crc kubenswrapper[4681]: E1007 17:38:15.052034 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"044230421ed0b97a1d2799f171d72696541a0bc5b40957025033252b4985a211\": container with ID starting with 044230421ed0b97a1d2799f171d72696541a0bc5b40957025033252b4985a211 not found: ID does not exist" containerID="044230421ed0b97a1d2799f171d72696541a0bc5b40957025033252b4985a211" Oct 07 17:38:15 crc kubenswrapper[4681]: I1007 17:38:15.052062 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"044230421ed0b97a1d2799f171d72696541a0bc5b40957025033252b4985a211"} err="failed to get container status \"044230421ed0b97a1d2799f171d72696541a0bc5b40957025033252b4985a211\": rpc error: code = NotFound desc = could not find container \"044230421ed0b97a1d2799f171d72696541a0bc5b40957025033252b4985a211\": container with ID starting with 044230421ed0b97a1d2799f171d72696541a0bc5b40957025033252b4985a211 not found: ID does not exist" Oct 07 17:38:15 crc kubenswrapper[4681]: I1007 17:38:15.052083 4681 scope.go:117] "RemoveContainer" containerID="bb3ed2af03a5e807b9a83210666a4adac55d6247a94b86111adcc8161b8035f6" Oct 07 17:38:15 crc kubenswrapper[4681]: E1007 17:38:15.052401 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb3ed2af03a5e807b9a83210666a4adac55d6247a94b86111adcc8161b8035f6\": container with ID starting with bb3ed2af03a5e807b9a83210666a4adac55d6247a94b86111adcc8161b8035f6 not found: ID does not exist" containerID="bb3ed2af03a5e807b9a83210666a4adac55d6247a94b86111adcc8161b8035f6" Oct 07 17:38:15 crc kubenswrapper[4681]: I1007 17:38:15.052420 4681 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bb3ed2af03a5e807b9a83210666a4adac55d6247a94b86111adcc8161b8035f6"} err="failed to get container status \"bb3ed2af03a5e807b9a83210666a4adac55d6247a94b86111adcc8161b8035f6\": rpc error: code = NotFound desc = could not find container \"bb3ed2af03a5e807b9a83210666a4adac55d6247a94b86111adcc8161b8035f6\": container with ID starting with bb3ed2af03a5e807b9a83210666a4adac55d6247a94b86111adcc8161b8035f6 not found: ID does not exist" Oct 07 17:38:15 crc kubenswrapper[4681]: I1007 17:38:15.052433 4681 scope.go:117] "RemoveContainer" containerID="acca44d980f48782dd65b2f53ad74761dd18c29171c88d673c3a8c4c0f773623" Oct 07 17:38:15 crc kubenswrapper[4681]: E1007 17:38:15.052653 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acca44d980f48782dd65b2f53ad74761dd18c29171c88d673c3a8c4c0f773623\": container with ID starting with acca44d980f48782dd65b2f53ad74761dd18c29171c88d673c3a8c4c0f773623 not found: ID does not exist" containerID="acca44d980f48782dd65b2f53ad74761dd18c29171c88d673c3a8c4c0f773623" Oct 07 17:38:15 crc kubenswrapper[4681]: I1007 17:38:15.052676 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acca44d980f48782dd65b2f53ad74761dd18c29171c88d673c3a8c4c0f773623"} err="failed to get container status \"acca44d980f48782dd65b2f53ad74761dd18c29171c88d673c3a8c4c0f773623\": rpc error: code = NotFound desc = could not find container \"acca44d980f48782dd65b2f53ad74761dd18c29171c88d673c3a8c4c0f773623\": container with ID starting with acca44d980f48782dd65b2f53ad74761dd18c29171c88d673c3a8c4c0f773623 not found: ID does not exist" Oct 07 17:38:15 crc kubenswrapper[4681]: I1007 17:38:15.070904 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbce2cd-bee0-4f15-8e80-534047ddd94e-catalog-content\") pod \"adbce2cd-bee0-4f15-8e80-534047ddd94e\" (UID: \"adbce2cd-bee0-4f15-8e80-534047ddd94e\") " Oct 07 17:38:15 crc kubenswrapper[4681]: I1007 17:38:15.071283 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbce2cd-bee0-4f15-8e80-534047ddd94e-utilities\") pod \"adbce2cd-bee0-4f15-8e80-534047ddd94e\" (UID: \"adbce2cd-bee0-4f15-8e80-534047ddd94e\") " Oct 07 17:38:15 crc kubenswrapper[4681]: I1007 17:38:15.071828 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ln5vf\" (UniqueName: \"kubernetes.io/projected/adbce2cd-bee0-4f15-8e80-534047ddd94e-kube-api-access-ln5vf\") on node \"crc\" DevicePath \"\"" Oct 07 17:38:15 crc kubenswrapper[4681]: I1007 17:38:15.074203 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adbce2cd-bee0-4f15-8e80-534047ddd94e-utilities" (OuterVolumeSpecName: "utilities") pod "adbce2cd-bee0-4f15-8e80-534047ddd94e" (UID: "adbce2cd-bee0-4f15-8e80-534047ddd94e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:38:15 crc kubenswrapper[4681]: I1007 17:38:15.112919 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adbce2cd-bee0-4f15-8e80-534047ddd94e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "adbce2cd-bee0-4f15-8e80-534047ddd94e" (UID: "adbce2cd-bee0-4f15-8e80-534047ddd94e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:38:15 crc kubenswrapper[4681]: I1007 17:38:15.173764 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbce2cd-bee0-4f15-8e80-534047ddd94e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 17:38:15 crc kubenswrapper[4681]: I1007 17:38:15.173799 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbce2cd-bee0-4f15-8e80-534047ddd94e-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 17:38:15 crc kubenswrapper[4681]: I1007 17:38:15.290204 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gd8nz"] Oct 07 17:38:15 crc kubenswrapper[4681]: I1007 17:38:15.301293 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gd8nz"] Oct 07 17:38:17 crc kubenswrapper[4681]: I1007 17:38:17.040393 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adbce2cd-bee0-4f15-8e80-534047ddd94e" path="/var/lib/kubelet/pods/adbce2cd-bee0-4f15-8e80-534047ddd94e/volumes" Oct 07 17:38:38 crc kubenswrapper[4681]: I1007 17:38:38.158920 4681 generic.go:334] "Generic (PLEG): container finished" podID="fea47565-ef99-4b31-869a-075d2d8331e9" containerID="9504377313c1ff3912c06536fb93855a43f4c622f3a880dd2a189b249031b9d3" exitCode=0 Oct 07 17:38:38 crc kubenswrapper[4681]: I1007 17:38:38.158987 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d494h" event={"ID":"fea47565-ef99-4b31-869a-075d2d8331e9","Type":"ContainerDied","Data":"9504377313c1ff3912c06536fb93855a43f4c622f3a880dd2a189b249031b9d3"} Oct 07 17:38:39 crc kubenswrapper[4681]: I1007 17:38:39.575711 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d494h" Oct 07 17:38:39 crc kubenswrapper[4681]: I1007 17:38:39.716782 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48vzq\" (UniqueName: \"kubernetes.io/projected/fea47565-ef99-4b31-869a-075d2d8331e9-kube-api-access-48vzq\") pod \"fea47565-ef99-4b31-869a-075d2d8331e9\" (UID: \"fea47565-ef99-4b31-869a-075d2d8331e9\") " Oct 07 17:38:39 crc kubenswrapper[4681]: I1007 17:38:39.717065 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fea47565-ef99-4b31-869a-075d2d8331e9-inventory\") pod \"fea47565-ef99-4b31-869a-075d2d8331e9\" (UID: \"fea47565-ef99-4b31-869a-075d2d8331e9\") " Oct 07 17:38:39 crc kubenswrapper[4681]: I1007 17:38:39.717191 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fea47565-ef99-4b31-869a-075d2d8331e9-ssh-key\") pod \"fea47565-ef99-4b31-869a-075d2d8331e9\" (UID: \"fea47565-ef99-4b31-869a-075d2d8331e9\") " Oct 07 17:38:39 crc kubenswrapper[4681]: I1007 17:38:39.723677 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fea47565-ef99-4b31-869a-075d2d8331e9-kube-api-access-48vzq" (OuterVolumeSpecName: "kube-api-access-48vzq") pod "fea47565-ef99-4b31-869a-075d2d8331e9" (UID: "fea47565-ef99-4b31-869a-075d2d8331e9"). InnerVolumeSpecName "kube-api-access-48vzq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:38:39 crc kubenswrapper[4681]: I1007 17:38:39.742086 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fea47565-ef99-4b31-869a-075d2d8331e9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fea47565-ef99-4b31-869a-075d2d8331e9" (UID: "fea47565-ef99-4b31-869a-075d2d8331e9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:38:39 crc kubenswrapper[4681]: I1007 17:38:39.762864 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fea47565-ef99-4b31-869a-075d2d8331e9-inventory" (OuterVolumeSpecName: "inventory") pod "fea47565-ef99-4b31-869a-075d2d8331e9" (UID: "fea47565-ef99-4b31-869a-075d2d8331e9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:38:39 crc kubenswrapper[4681]: I1007 17:38:39.819182 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48vzq\" (UniqueName: \"kubernetes.io/projected/fea47565-ef99-4b31-869a-075d2d8331e9-kube-api-access-48vzq\") on node \"crc\" DevicePath \"\"" Oct 07 17:38:39 crc kubenswrapper[4681]: I1007 17:38:39.819417 4681 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fea47565-ef99-4b31-869a-075d2d8331e9-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 17:38:39 crc kubenswrapper[4681]: I1007 17:38:39.819533 4681 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fea47565-ef99-4b31-869a-075d2d8331e9-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.176341 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d494h" event={"ID":"fea47565-ef99-4b31-869a-075d2d8331e9","Type":"ContainerDied","Data":"6035345f52bb75d293a74eca184e87e3e2ad025bcc2fa6d35254c19f9c4ef1fa"} Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.176389 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6035345f52bb75d293a74eca184e87e3e2ad025bcc2fa6d35254c19f9c4ef1fa" Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.176785 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-d494h" Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.268327 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-qlsmk"] Oct 07 17:38:40 crc kubenswrapper[4681]: E1007 17:38:40.268783 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adbce2cd-bee0-4f15-8e80-534047ddd94e" containerName="extract-utilities" Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.268799 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbce2cd-bee0-4f15-8e80-534047ddd94e" containerName="extract-utilities" Oct 07 17:38:40 crc kubenswrapper[4681]: E1007 17:38:40.268810 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adbce2cd-bee0-4f15-8e80-534047ddd94e" containerName="registry-server" Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.268817 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbce2cd-bee0-4f15-8e80-534047ddd94e" containerName="registry-server" Oct 07 17:38:40 crc kubenswrapper[4681]: E1007 17:38:40.268827 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adbce2cd-bee0-4f15-8e80-534047ddd94e" containerName="extract-content" Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.268835 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbce2cd-bee0-4f15-8e80-534047ddd94e" containerName="extract-content" Oct 07 17:38:40 crc kubenswrapper[4681]: E1007 17:38:40.268860 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d8fbe23-c90b-4866-a733-a86051caf14b" containerName="registry-server" Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.268866 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d8fbe23-c90b-4866-a733-a86051caf14b" containerName="registry-server" Oct 07 17:38:40 crc kubenswrapper[4681]: E1007 17:38:40.268906 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d8fbe23-c90b-4866-a733-a86051caf14b" containerName="extract-content" Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.268915 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d8fbe23-c90b-4866-a733-a86051caf14b" containerName="extract-content" Oct 07 17:38:40 crc kubenswrapper[4681]: E1007 17:38:40.268931 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea47565-ef99-4b31-869a-075d2d8331e9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.268939 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea47565-ef99-4b31-869a-075d2d8331e9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 17:38:40 crc kubenswrapper[4681]: E1007 17:38:40.268952 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d8fbe23-c90b-4866-a733-a86051caf14b" containerName="extract-utilities" Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.268959 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d8fbe23-c90b-4866-a733-a86051caf14b" containerName="extract-utilities" Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.269151 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="adbce2cd-bee0-4f15-8e80-534047ddd94e" containerName="registry-server" Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.269173 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d8fbe23-c90b-4866-a733-a86051caf14b" containerName="registry-server" Oct 07 17:38:40 crc 
Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.269756 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-qlsmk"
Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.272857 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.272992 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.273071 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vtl6"
Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.273116 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.285331 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-qlsmk"]
Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.431703 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4022d381-ce12-4c86-9368-4089026a66d3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-qlsmk\" (UID: \"4022d381-ce12-4c86-9368-4089026a66d3\") " pod="openstack/ssh-known-hosts-edpm-deployment-qlsmk"
Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.431788 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4022d381-ce12-4c86-9368-4089026a66d3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-qlsmk\" (UID: \"4022d381-ce12-4c86-9368-4089026a66d3\") " pod="openstack/ssh-known-hosts-edpm-deployment-qlsmk"
Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.431810 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s22d7\" (UniqueName: \"kubernetes.io/projected/4022d381-ce12-4c86-9368-4089026a66d3-kube-api-access-s22d7\") pod \"ssh-known-hosts-edpm-deployment-qlsmk\" (UID: \"4022d381-ce12-4c86-9368-4089026a66d3\") " pod="openstack/ssh-known-hosts-edpm-deployment-qlsmk"
Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.532873 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4022d381-ce12-4c86-9368-4089026a66d3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-qlsmk\" (UID: \"4022d381-ce12-4c86-9368-4089026a66d3\") " pod="openstack/ssh-known-hosts-edpm-deployment-qlsmk"
Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.532972 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s22d7\" (UniqueName: \"kubernetes.io/projected/4022d381-ce12-4c86-9368-4089026a66d3-kube-api-access-s22d7\") pod \"ssh-known-hosts-edpm-deployment-qlsmk\" (UID: \"4022d381-ce12-4c86-9368-4089026a66d3\") " pod="openstack/ssh-known-hosts-edpm-deployment-qlsmk"
Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.532991 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4022d381-ce12-4c86-9368-4089026a66d3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-qlsmk\" (UID: \"4022d381-ce12-4c86-9368-4089026a66d3\") " pod="openstack/ssh-known-hosts-edpm-deployment-qlsmk"
Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.544068 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4022d381-ce12-4c86-9368-4089026a66d3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-qlsmk\" (UID: \"4022d381-ce12-4c86-9368-4089026a66d3\") " pod="openstack/ssh-known-hosts-edpm-deployment-qlsmk"
Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.544310 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4022d381-ce12-4c86-9368-4089026a66d3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-qlsmk\" (UID: \"4022d381-ce12-4c86-9368-4089026a66d3\") " pod="openstack/ssh-known-hosts-edpm-deployment-qlsmk"
Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.552956 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s22d7\" (UniqueName: \"kubernetes.io/projected/4022d381-ce12-4c86-9368-4089026a66d3-kube-api-access-s22d7\") pod \"ssh-known-hosts-edpm-deployment-qlsmk\" (UID: \"4022d381-ce12-4c86-9368-4089026a66d3\") " pod="openstack/ssh-known-hosts-edpm-deployment-qlsmk"
Oct 07 17:38:40 crc kubenswrapper[4681]: I1007 17:38:40.586526 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-qlsmk"
Oct 07 17:38:41 crc kubenswrapper[4681]: I1007 17:38:41.082730 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-qlsmk"]
Oct 07 17:38:41 crc kubenswrapper[4681]: I1007 17:38:41.184071 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-qlsmk" event={"ID":"4022d381-ce12-4c86-9368-4089026a66d3","Type":"ContainerStarted","Data":"364508b4b0b783265b7d00ed477abc91015455a9c1fea0a8a9c5b272519995ac"}
Oct 07 17:38:42 crc kubenswrapper[4681]: I1007 17:38:42.192753 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-qlsmk" event={"ID":"4022d381-ce12-4c86-9368-4089026a66d3","Type":"ContainerStarted","Data":"8f7816c809b9a6901c61da22978160f332e9c0a197d4f80bcebdef5ac7365e07"}
Oct 07 17:38:42 crc kubenswrapper[4681]: I1007 17:38:42.217159 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-qlsmk" podStartSLOduration=2.081557627 podStartE2EDuration="2.217142304s" podCreationTimestamp="2025-10-07 17:38:40 +0000 UTC" firstStartedPulling="2025-10-07 17:38:41.088511564 +0000 UTC m=+2124.735923119" lastFinishedPulling="2025-10-07 17:38:41.224096251 +0000 UTC m=+2124.871507796" observedRunningTime="2025-10-07 17:38:42.21271624 +0000 UTC m=+2125.860127795" watchObservedRunningTime="2025-10-07 17:38:42.217142304 +0000 UTC m=+2125.864553859"
pod="openstack/ssh-known-hosts-edpm-deployment-qlsmk" event={"ID":"4022d381-ce12-4c86-9368-4089026a66d3","Type":"ContainerDied","Data":"8f7816c809b9a6901c61da22978160f332e9c0a197d4f80bcebdef5ac7365e07"} Oct 07 17:38:50 crc kubenswrapper[4681]: I1007 17:38:50.676467 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-qlsmk" Oct 07 17:38:50 crc kubenswrapper[4681]: I1007 17:38:50.726268 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4022d381-ce12-4c86-9368-4089026a66d3-inventory-0\") pod \"4022d381-ce12-4c86-9368-4089026a66d3\" (UID: \"4022d381-ce12-4c86-9368-4089026a66d3\") " Oct 07 17:38:50 crc kubenswrapper[4681]: I1007 17:38:50.726338 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4022d381-ce12-4c86-9368-4089026a66d3-ssh-key-openstack-edpm-ipam\") pod \"4022d381-ce12-4c86-9368-4089026a66d3\" (UID: \"4022d381-ce12-4c86-9368-4089026a66d3\") " Oct 07 17:38:50 crc kubenswrapper[4681]: I1007 17:38:50.726472 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s22d7\" (UniqueName: \"kubernetes.io/projected/4022d381-ce12-4c86-9368-4089026a66d3-kube-api-access-s22d7\") pod \"4022d381-ce12-4c86-9368-4089026a66d3\" (UID: \"4022d381-ce12-4c86-9368-4089026a66d3\") " Oct 07 17:38:50 crc kubenswrapper[4681]: I1007 17:38:50.733104 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4022d381-ce12-4c86-9368-4089026a66d3-kube-api-access-s22d7" (OuterVolumeSpecName: "kube-api-access-s22d7") pod "4022d381-ce12-4c86-9368-4089026a66d3" (UID: "4022d381-ce12-4c86-9368-4089026a66d3"). InnerVolumeSpecName "kube-api-access-s22d7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:38:50 crc kubenswrapper[4681]: I1007 17:38:50.761040 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4022d381-ce12-4c86-9368-4089026a66d3-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "4022d381-ce12-4c86-9368-4089026a66d3" (UID: "4022d381-ce12-4c86-9368-4089026a66d3"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:38:50 crc kubenswrapper[4681]: I1007 17:38:50.766352 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4022d381-ce12-4c86-9368-4089026a66d3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4022d381-ce12-4c86-9368-4089026a66d3" (UID: "4022d381-ce12-4c86-9368-4089026a66d3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:38:50 crc kubenswrapper[4681]: I1007 17:38:50.829232 4681 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4022d381-ce12-4c86-9368-4089026a66d3-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 07 17:38:50 crc kubenswrapper[4681]: I1007 17:38:50.829269 4681 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4022d381-ce12-4c86-9368-4089026a66d3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 07 17:38:50 crc kubenswrapper[4681]: I1007 17:38:50.829287 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s22d7\" (UniqueName: \"kubernetes.io/projected/4022d381-ce12-4c86-9368-4089026a66d3-kube-api-access-s22d7\") on node \"crc\" DevicePath \"\"" Oct 07 17:38:51 crc kubenswrapper[4681]: I1007 17:38:51.265676 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-qlsmk" event={"ID":"4022d381-ce12-4c86-9368-4089026a66d3","Type":"ContainerDied","Data":"364508b4b0b783265b7d00ed477abc91015455a9c1fea0a8a9c5b272519995ac"} Oct 07 17:38:51 crc kubenswrapper[4681]: I1007 17:38:51.266139 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="364508b4b0b783265b7d00ed477abc91015455a9c1fea0a8a9c5b272519995ac" Oct 07 17:38:51 crc kubenswrapper[4681]: I1007 17:38:51.265721 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-qlsmk" Oct 07 17:38:51 crc kubenswrapper[4681]: I1007 17:38:51.425566 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rp758"] Oct 07 17:38:51 crc kubenswrapper[4681]: E1007 17:38:51.426010 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4022d381-ce12-4c86-9368-4089026a66d3" containerName="ssh-known-hosts-edpm-deployment" Oct 07 17:38:51 crc kubenswrapper[4681]: I1007 17:38:51.426031 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="4022d381-ce12-4c86-9368-4089026a66d3" containerName="ssh-known-hosts-edpm-deployment" Oct 07 17:38:51 crc kubenswrapper[4681]: I1007 17:38:51.426215 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="4022d381-ce12-4c86-9368-4089026a66d3" containerName="ssh-known-hosts-edpm-deployment" Oct 07 17:38:51 crc kubenswrapper[4681]: I1007 17:38:51.426908 4681 util.go:30] "No sandbox for pod can be found. 
Oct 07 17:38:51 crc kubenswrapper[4681]: I1007 17:38:51.425566 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rp758"]
Oct 07 17:38:51 crc kubenswrapper[4681]: E1007 17:38:51.426010 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4022d381-ce12-4c86-9368-4089026a66d3" containerName="ssh-known-hosts-edpm-deployment"
Oct 07 17:38:51 crc kubenswrapper[4681]: I1007 17:38:51.426031 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="4022d381-ce12-4c86-9368-4089026a66d3" containerName="ssh-known-hosts-edpm-deployment"
Oct 07 17:38:51 crc kubenswrapper[4681]: I1007 17:38:51.426215 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="4022d381-ce12-4c86-9368-4089026a66d3" containerName="ssh-known-hosts-edpm-deployment"
Oct 07 17:38:51 crc kubenswrapper[4681]: I1007 17:38:51.426908 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rp758"
Oct 07 17:38:51 crc kubenswrapper[4681]: I1007 17:38:51.431080 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 07 17:38:51 crc kubenswrapper[4681]: I1007 17:38:51.431158 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vtl6"
Oct 07 17:38:51 crc kubenswrapper[4681]: I1007 17:38:51.431418 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 07 17:38:51 crc kubenswrapper[4681]: I1007 17:38:51.433656 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 07 17:38:51 crc kubenswrapper[4681]: I1007 17:38:51.437635 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rp758"]
Oct 07 17:38:51 crc kubenswrapper[4681]: I1007 17:38:51.439691 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da2c6bbc-c4e1-4767-8815-fbc4cada002a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rp758\" (UID: \"da2c6bbc-c4e1-4767-8815-fbc4cada002a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rp758"
Oct 07 17:38:51 crc kubenswrapper[4681]: I1007 17:38:51.439780 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9kgh\" (UniqueName: \"kubernetes.io/projected/da2c6bbc-c4e1-4767-8815-fbc4cada002a-kube-api-access-l9kgh\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rp758\" (UID: \"da2c6bbc-c4e1-4767-8815-fbc4cada002a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rp758"
Oct 07 17:38:51 crc kubenswrapper[4681]: I1007 17:38:51.439809 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da2c6bbc-c4e1-4767-8815-fbc4cada002a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rp758\" (UID: \"da2c6bbc-c4e1-4767-8815-fbc4cada002a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rp758"
Oct 07 17:38:51 crc kubenswrapper[4681]: I1007 17:38:51.541809 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9kgh\" (UniqueName: \"kubernetes.io/projected/da2c6bbc-c4e1-4767-8815-fbc4cada002a-kube-api-access-l9kgh\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rp758\" (UID: \"da2c6bbc-c4e1-4767-8815-fbc4cada002a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rp758"
Oct 07 17:38:51 crc kubenswrapper[4681]: I1007 17:38:51.542496 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da2c6bbc-c4e1-4767-8815-fbc4cada002a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rp758\" (UID: \"da2c6bbc-c4e1-4767-8815-fbc4cada002a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rp758"
Oct 07 17:38:51 crc kubenswrapper[4681]: I1007 17:38:51.542679 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da2c6bbc-c4e1-4767-8815-fbc4cada002a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rp758\" (UID: \"da2c6bbc-c4e1-4767-8815-fbc4cada002a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rp758"
Oct 07 17:38:51 crc kubenswrapper[4681]: I1007 17:38:51.546614 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da2c6bbc-c4e1-4767-8815-fbc4cada002a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rp758\" (UID: \"da2c6bbc-c4e1-4767-8815-fbc4cada002a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rp758"
Oct 07 17:38:51 crc kubenswrapper[4681]: I1007 17:38:51.546929 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da2c6bbc-c4e1-4767-8815-fbc4cada002a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rp758\" (UID: \"da2c6bbc-c4e1-4767-8815-fbc4cada002a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rp758"
Oct 07 17:38:51 crc kubenswrapper[4681]: I1007 17:38:51.559838 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9kgh\" (UniqueName: \"kubernetes.io/projected/da2c6bbc-c4e1-4767-8815-fbc4cada002a-kube-api-access-l9kgh\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rp758\" (UID: \"da2c6bbc-c4e1-4767-8815-fbc4cada002a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rp758"
Oct 07 17:38:51 crc kubenswrapper[4681]: I1007 17:38:51.811333 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rp758"
Oct 07 17:38:52 crc kubenswrapper[4681]: I1007 17:38:52.347701 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rp758"]
Oct 07 17:38:52 crc kubenswrapper[4681]: W1007 17:38:52.360989 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda2c6bbc_c4e1_4767_8815_fbc4cada002a.slice/crio-0e76b7b1f112571013ef6d19cab4a0830c599af576018bc11a5272d66cbb6d22 WatchSource:0}: Error finding container 0e76b7b1f112571013ef6d19cab4a0830c599af576018bc11a5272d66cbb6d22: Status 404 returned error can't find the container with id 0e76b7b1f112571013ef6d19cab4a0830c599af576018bc11a5272d66cbb6d22
Oct 07 17:38:53 crc kubenswrapper[4681]: I1007 17:38:53.291225 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rp758" event={"ID":"da2c6bbc-c4e1-4767-8815-fbc4cada002a","Type":"ContainerStarted","Data":"11a68fb2dbc8883afbd610100dcbed60dd0fad5bf2cb89e4aa34c8af0debc9cf"}
Oct 07 17:38:53 crc kubenswrapper[4681]: I1007 17:38:53.291487 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rp758" event={"ID":"da2c6bbc-c4e1-4767-8815-fbc4cada002a","Type":"ContainerStarted","Data":"0e76b7b1f112571013ef6d19cab4a0830c599af576018bc11a5272d66cbb6d22"}
m=+2136.961235584" Oct 07 17:39:01 crc kubenswrapper[4681]: I1007 17:39:01.366073 4681 generic.go:334] "Generic (PLEG): container finished" podID="da2c6bbc-c4e1-4767-8815-fbc4cada002a" containerID="11a68fb2dbc8883afbd610100dcbed60dd0fad5bf2cb89e4aa34c8af0debc9cf" exitCode=0 Oct 07 17:39:01 crc kubenswrapper[4681]: I1007 17:39:01.366522 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rp758" event={"ID":"da2c6bbc-c4e1-4767-8815-fbc4cada002a","Type":"ContainerDied","Data":"11a68fb2dbc8883afbd610100dcbed60dd0fad5bf2cb89e4aa34c8af0debc9cf"} Oct 07 17:39:02 crc kubenswrapper[4681]: I1007 17:39:02.746658 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rp758" Oct 07 17:39:02 crc kubenswrapper[4681]: I1007 17:39:02.938286 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9kgh\" (UniqueName: \"kubernetes.io/projected/da2c6bbc-c4e1-4767-8815-fbc4cada002a-kube-api-access-l9kgh\") pod \"da2c6bbc-c4e1-4767-8815-fbc4cada002a\" (UID: \"da2c6bbc-c4e1-4767-8815-fbc4cada002a\") " Oct 07 17:39:02 crc kubenswrapper[4681]: I1007 17:39:02.938677 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da2c6bbc-c4e1-4767-8815-fbc4cada002a-ssh-key\") pod \"da2c6bbc-c4e1-4767-8815-fbc4cada002a\" (UID: \"da2c6bbc-c4e1-4767-8815-fbc4cada002a\") " Oct 07 17:39:02 crc kubenswrapper[4681]: I1007 17:39:02.939041 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da2c6bbc-c4e1-4767-8815-fbc4cada002a-inventory\") pod \"da2c6bbc-c4e1-4767-8815-fbc4cada002a\" (UID: \"da2c6bbc-c4e1-4767-8815-fbc4cada002a\") " Oct 07 17:39:02 crc kubenswrapper[4681]: I1007 17:39:02.944317 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da2c6bbc-c4e1-4767-8815-fbc4cada002a-kube-api-access-l9kgh" (OuterVolumeSpecName: "kube-api-access-l9kgh") pod "da2c6bbc-c4e1-4767-8815-fbc4cada002a" (UID: "da2c6bbc-c4e1-4767-8815-fbc4cada002a"). InnerVolumeSpecName "kube-api-access-l9kgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:39:02 crc kubenswrapper[4681]: I1007 17:39:02.965551 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da2c6bbc-c4e1-4767-8815-fbc4cada002a-inventory" (OuterVolumeSpecName: "inventory") pod "da2c6bbc-c4e1-4767-8815-fbc4cada002a" (UID: "da2c6bbc-c4e1-4767-8815-fbc4cada002a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:39:02 crc kubenswrapper[4681]: I1007 17:39:02.975137 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da2c6bbc-c4e1-4767-8815-fbc4cada002a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "da2c6bbc-c4e1-4767-8815-fbc4cada002a" (UID: "da2c6bbc-c4e1-4767-8815-fbc4cada002a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:39:03 crc kubenswrapper[4681]: I1007 17:39:03.042238 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9kgh\" (UniqueName: \"kubernetes.io/projected/da2c6bbc-c4e1-4767-8815-fbc4cada002a-kube-api-access-l9kgh\") on node \"crc\" DevicePath \"\"" Oct 07 17:39:03 crc kubenswrapper[4681]: I1007 17:39:03.043055 4681 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da2c6bbc-c4e1-4767-8815-fbc4cada002a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 17:39:03 crc kubenswrapper[4681]: I1007 17:39:03.043082 4681 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da2c6bbc-c4e1-4767-8815-fbc4cada002a-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 17:39:03 crc kubenswrapper[4681]: I1007 17:39:03.387521 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rp758" event={"ID":"da2c6bbc-c4e1-4767-8815-fbc4cada002a","Type":"ContainerDied","Data":"0e76b7b1f112571013ef6d19cab4a0830c599af576018bc11a5272d66cbb6d22"} Oct 07 17:39:03 crc kubenswrapper[4681]: I1007 17:39:03.387560 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e76b7b1f112571013ef6d19cab4a0830c599af576018bc11a5272d66cbb6d22" Oct 07 17:39:03 crc kubenswrapper[4681]: I1007 17:39:03.387570 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rp758" Oct 07 17:39:03 crc kubenswrapper[4681]: I1007 17:39:03.469704 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl"] Oct 07 17:39:03 crc kubenswrapper[4681]: E1007 17:39:03.470153 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da2c6bbc-c4e1-4767-8815-fbc4cada002a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 07 17:39:03 crc kubenswrapper[4681]: I1007 17:39:03.470172 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2c6bbc-c4e1-4767-8815-fbc4cada002a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 07 17:39:03 crc kubenswrapper[4681]: I1007 17:39:03.470348 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="da2c6bbc-c4e1-4767-8815-fbc4cada002a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 07 17:39:03 crc kubenswrapper[4681]: I1007 17:39:03.470982 4681 util.go:30] "No sandbox for pod can be found. 
Oct 07 17:39:03 crc kubenswrapper[4681]: I1007 17:39:03.470982 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl"
Oct 07 17:39:03 crc kubenswrapper[4681]: I1007 17:39:03.477273 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 07 17:39:03 crc kubenswrapper[4681]: I1007 17:39:03.477357 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 07 17:39:03 crc kubenswrapper[4681]: I1007 17:39:03.477372 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vtl6"
Oct 07 17:39:03 crc kubenswrapper[4681]: I1007 17:39:03.477529 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 07 17:39:03 crc kubenswrapper[4681]: I1007 17:39:03.480089 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl"]
Oct 07 17:39:03 crc kubenswrapper[4681]: I1007 17:39:03.551589 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psxch\" (UniqueName: \"kubernetes.io/projected/12f3d295-f471-4e4d-9884-3bf34dab377f-kube-api-access-psxch\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl\" (UID: \"12f3d295-f471-4e4d-9884-3bf34dab377f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl"
Oct 07 17:39:03 crc kubenswrapper[4681]: I1007 17:39:03.551847 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12f3d295-f471-4e4d-9884-3bf34dab377f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl\" (UID: \"12f3d295-f471-4e4d-9884-3bf34dab377f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl"
Oct 07 17:39:03 crc kubenswrapper[4681]: I1007 17:39:03.551966 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12f3d295-f471-4e4d-9884-3bf34dab377f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl\" (UID: \"12f3d295-f471-4e4d-9884-3bf34dab377f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl"
Oct 07 17:39:03 crc kubenswrapper[4681]: I1007 17:39:03.652964 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12f3d295-f471-4e4d-9884-3bf34dab377f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl\" (UID: \"12f3d295-f471-4e4d-9884-3bf34dab377f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl"
Oct 07 17:39:03 crc kubenswrapper[4681]: I1007 17:39:03.653085 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psxch\" (UniqueName: \"kubernetes.io/projected/12f3d295-f471-4e4d-9884-3bf34dab377f-kube-api-access-psxch\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl\" (UID: \"12f3d295-f471-4e4d-9884-3bf34dab377f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl"
Oct 07 17:39:03 crc kubenswrapper[4681]: I1007 17:39:03.653114 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12f3d295-f471-4e4d-9884-3bf34dab377f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl\" (UID: \"12f3d295-f471-4e4d-9884-3bf34dab377f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl"
Oct 07 17:39:03 crc kubenswrapper[4681]: I1007 17:39:03.657042 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12f3d295-f471-4e4d-9884-3bf34dab377f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl\" (UID: \"12f3d295-f471-4e4d-9884-3bf34dab377f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl"
Oct 07 17:39:03 crc kubenswrapper[4681]: I1007 17:39:03.658102 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12f3d295-f471-4e4d-9884-3bf34dab377f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl\" (UID: \"12f3d295-f471-4e4d-9884-3bf34dab377f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl"
Oct 07 17:39:03 crc kubenswrapper[4681]: I1007 17:39:03.684152 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psxch\" (UniqueName: \"kubernetes.io/projected/12f3d295-f471-4e4d-9884-3bf34dab377f-kube-api-access-psxch\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl\" (UID: \"12f3d295-f471-4e4d-9884-3bf34dab377f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl"
Oct 07 17:39:03 crc kubenswrapper[4681]: I1007 17:39:03.788720 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl"
Oct 07 17:39:04 crc kubenswrapper[4681]: I1007 17:39:04.309340 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl"]
Oct 07 17:39:04 crc kubenswrapper[4681]: I1007 17:39:04.396205 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl" event={"ID":"12f3d295-f471-4e4d-9884-3bf34dab377f","Type":"ContainerStarted","Data":"d4d29d779689d51702af33730e0e7143c9c3389cb0d1cc788e3ef76b5bed7280"}
Oct 07 17:39:05 crc kubenswrapper[4681]: I1007 17:39:05.403674 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl" event={"ID":"12f3d295-f471-4e4d-9884-3bf34dab377f","Type":"ContainerStarted","Data":"ec11cf260bbd22269c482ff5db0cad74b4f30c1782d43430236d6fab0a5450e8"}
Oct 07 17:39:05 crc kubenswrapper[4681]: I1007 17:39:05.421494 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl" podStartSLOduration=2.272644091 podStartE2EDuration="2.421471916s" podCreationTimestamp="2025-10-07 17:39:03 +0000 UTC" firstStartedPulling="2025-10-07 17:39:04.320267568 +0000 UTC m=+2147.967679123" lastFinishedPulling="2025-10-07 17:39:04.469095393 +0000 UTC m=+2148.116506948" observedRunningTime="2025-10-07 17:39:05.420617951 +0000 UTC m=+2149.068029546" watchObservedRunningTime="2025-10-07 17:39:05.421471916 +0000 UTC m=+2149.068883471"
Oct 07 17:39:12 crc kubenswrapper[4681]: I1007 17:39:12.195258 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 17:39:12 crc kubenswrapper[4681]: I1007 17:39:12.195701 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 17:39:14 crc kubenswrapper[4681]: I1007 17:39:14.469331 4681 generic.go:334] "Generic (PLEG): container finished" podID="12f3d295-f471-4e4d-9884-3bf34dab377f" containerID="ec11cf260bbd22269c482ff5db0cad74b4f30c1782d43430236d6fab0a5450e8" exitCode=0
Oct 07 17:39:14 crc kubenswrapper[4681]: I1007 17:39:14.469522 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl" event={"ID":"12f3d295-f471-4e4d-9884-3bf34dab377f","Type":"ContainerDied","Data":"ec11cf260bbd22269c482ff5db0cad74b4f30c1782d43430236d6fab0a5450e8"}
Oct 07 17:39:15 crc kubenswrapper[4681]: I1007 17:39:15.856028 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl"
Oct 07 17:39:15 crc kubenswrapper[4681]: I1007 17:39:15.982910 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12f3d295-f471-4e4d-9884-3bf34dab377f-ssh-key\") pod \"12f3d295-f471-4e4d-9884-3bf34dab377f\" (UID: \"12f3d295-f471-4e4d-9884-3bf34dab377f\") "
Oct 07 17:39:15 crc kubenswrapper[4681]: I1007 17:39:15.983079 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psxch\" (UniqueName: \"kubernetes.io/projected/12f3d295-f471-4e4d-9884-3bf34dab377f-kube-api-access-psxch\") pod \"12f3d295-f471-4e4d-9884-3bf34dab377f\" (UID: \"12f3d295-f471-4e4d-9884-3bf34dab377f\") "
Oct 07 17:39:15 crc kubenswrapper[4681]: I1007 17:39:15.983229 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12f3d295-f471-4e4d-9884-3bf34dab377f-inventory\") pod \"12f3d295-f471-4e4d-9884-3bf34dab377f\" (UID: \"12f3d295-f471-4e4d-9884-3bf34dab377f\") "
Oct 07 17:39:15 crc kubenswrapper[4681]: I1007 17:39:15.988134 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12f3d295-f471-4e4d-9884-3bf34dab377f-kube-api-access-psxch" (OuterVolumeSpecName: "kube-api-access-psxch") pod "12f3d295-f471-4e4d-9884-3bf34dab377f" (UID: "12f3d295-f471-4e4d-9884-3bf34dab377f"). InnerVolumeSpecName "kube-api-access-psxch". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.008582 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12f3d295-f471-4e4d-9884-3bf34dab377f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "12f3d295-f471-4e4d-9884-3bf34dab377f" (UID: "12f3d295-f471-4e4d-9884-3bf34dab377f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.085681 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psxch\" (UniqueName: \"kubernetes.io/projected/12f3d295-f471-4e4d-9884-3bf34dab377f-kube-api-access-psxch\") on node \"crc\" DevicePath \"\"" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.085719 4681 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12f3d295-f471-4e4d-9884-3bf34dab377f-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.085728 4681 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12f3d295-f471-4e4d-9884-3bf34dab377f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.486265 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl" event={"ID":"12f3d295-f471-4e4d-9884-3bf34dab377f","Type":"ContainerDied","Data":"d4d29d779689d51702af33730e0e7143c9c3389cb0d1cc788e3ef76b5bed7280"} Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.486301 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4d29d779689d51702af33730e0e7143c9c3389cb0d1cc788e3ef76b5bed7280" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.486334 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.575341 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg"] Oct 07 17:39:16 crc kubenswrapper[4681]: E1007 17:39:16.575763 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12f3d295-f471-4e4d-9884-3bf34dab377f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.575785 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="12f3d295-f471-4e4d-9884-3bf34dab377f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.576047 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="12f3d295-f471-4e4d-9884-3bf34dab377f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.576771 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.580487 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.580582 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vtl6" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.581471 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.581966 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.587383 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg"] Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.611134 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.611480 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.612163 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.612496 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.714783 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.714828 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.714862 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.714919 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.714945 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.715675 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.715708 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.715736 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.715765 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.715806 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.715837 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.715867 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.715925 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4z5s\" (UniqueName: \"kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-kube-api-access-m4z5s\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.715977 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.818392 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.818451 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.818522 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.818554 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.818592 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.819469 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.819526 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.819568 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4z5s\" (UniqueName: \"kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-kube-api-access-m4z5s\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.819621 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.819661 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.819687 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.819731 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.819793 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.819837 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.828380 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.828776 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.829042 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.832192 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.832481 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.833061 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.833946 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.833983 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.834238 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.836825 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.839034 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.839692 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.840946 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.844306 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4z5s\" (UniqueName: \"kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-kube-api-access-m4z5s\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:16 crc kubenswrapper[4681]: I1007 17:39:16.923200 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:39:17 crc kubenswrapper[4681]: I1007 17:39:17.510994 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg"] Oct 07 17:39:17 crc kubenswrapper[4681]: I1007 17:39:17.674659 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 17:39:18 crc kubenswrapper[4681]: I1007 17:39:18.504808 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" event={"ID":"07a9584a-a546-4ec3-ba13-1f0db8c3ba39","Type":"ContainerStarted","Data":"661434dc86441c9e584389ddf8e83bbaa61615abfbfeb1f3f490f038d14a6fbf"} Oct 07 17:39:18 crc kubenswrapper[4681]: I1007 17:39:18.505378 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" event={"ID":"07a9584a-a546-4ec3-ba13-1f0db8c3ba39","Type":"ContainerStarted","Data":"09cdcc3b586131bbf7a3f8f4152a92f706e7f3ed78a0d9b96fab537d77188343"} Oct 07 17:39:18 crc kubenswrapper[4681]: I1007 17:39:18.532822 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" podStartSLOduration=2.38955258 podStartE2EDuration="2.53280716s" podCreationTimestamp="2025-10-07 17:39:16 +0000 UTC" firstStartedPulling="2025-10-07 17:39:17.525855632 +0000 UTC m=+2161.173267187" lastFinishedPulling="2025-10-07 17:39:17.669110212 +0000 UTC m=+2161.316521767" observedRunningTime="2025-10-07 17:39:18.532060549 +0000 UTC m=+2162.179472104" watchObservedRunningTime="2025-10-07 17:39:18.53280716 +0000 UTC m=+2162.180218705" Oct 07 17:39:42 crc kubenswrapper[4681]: I1007 17:39:42.195906 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 17:39:42 crc kubenswrapper[4681]: I1007 17:39:42.196487 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 17:40:01 crc kubenswrapper[4681]: I1007 17:40:01.849829 4681 generic.go:334] "Generic (PLEG): container finished" podID="07a9584a-a546-4ec3-ba13-1f0db8c3ba39" containerID="661434dc86441c9e584389ddf8e83bbaa61615abfbfeb1f3f490f038d14a6fbf" exitCode=0 Oct 07 17:40:01 crc kubenswrapper[4681]: I1007 17:40:01.850394 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" event={"ID":"07a9584a-a546-4ec3-ba13-1f0db8c3ba39","Type":"ContainerDied","Data":"661434dc86441c9e584389ddf8e83bbaa61615abfbfeb1f3f490f038d14a6fbf"} Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.255285 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.447732 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-bootstrap-combined-ca-bundle\") pod \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.447792 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-openstack-edpm-ipam-ovn-default-certs-0\") pod \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.447831 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-libvirt-combined-ca-bundle\") pod \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.447855 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-repo-setup-combined-ca-bundle\") pod \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.447927 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-inventory\") pod \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.447957 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-telemetry-combined-ca-bundle\") pod \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.448010 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.448066 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4z5s\" (UniqueName: \"kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-kube-api-access-m4z5s\") pod \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.448108 4681 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-neutron-metadata-combined-ca-bundle\") pod \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.448141 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-ovn-combined-ca-bundle\") pod \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.448192 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-ssh-key\") pod \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.448239 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.448292 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.448352 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-nova-combined-ca-bundle\") pod \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\" (UID: \"07a9584a-a546-4ec3-ba13-1f0db8c3ba39\") " Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.453731 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "07a9584a-a546-4ec3-ba13-1f0db8c3ba39" (UID: "07a9584a-a546-4ec3-ba13-1f0db8c3ba39"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.454030 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "07a9584a-a546-4ec3-ba13-1f0db8c3ba39" (UID: "07a9584a-a546-4ec3-ba13-1f0db8c3ba39"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.455819 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "07a9584a-a546-4ec3-ba13-1f0db8c3ba39" (UID: "07a9584a-a546-4ec3-ba13-1f0db8c3ba39"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.456849 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "07a9584a-a546-4ec3-ba13-1f0db8c3ba39" (UID: "07a9584a-a546-4ec3-ba13-1f0db8c3ba39"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.457404 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "07a9584a-a546-4ec3-ba13-1f0db8c3ba39" (UID: "07a9584a-a546-4ec3-ba13-1f0db8c3ba39"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.457449 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-kube-api-access-m4z5s" (OuterVolumeSpecName: "kube-api-access-m4z5s") pod "07a9584a-a546-4ec3-ba13-1f0db8c3ba39" (UID: "07a9584a-a546-4ec3-ba13-1f0db8c3ba39"). InnerVolumeSpecName "kube-api-access-m4z5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.458243 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "07a9584a-a546-4ec3-ba13-1f0db8c3ba39" (UID: "07a9584a-a546-4ec3-ba13-1f0db8c3ba39"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.458348 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "07a9584a-a546-4ec3-ba13-1f0db8c3ba39" (UID: "07a9584a-a546-4ec3-ba13-1f0db8c3ba39"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.458731 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "07a9584a-a546-4ec3-ba13-1f0db8c3ba39" (UID: "07a9584a-a546-4ec3-ba13-1f0db8c3ba39"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.463121 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "07a9584a-a546-4ec3-ba13-1f0db8c3ba39" (UID: "07a9584a-a546-4ec3-ba13-1f0db8c3ba39"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.466027 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "07a9584a-a546-4ec3-ba13-1f0db8c3ba39" (UID: "07a9584a-a546-4ec3-ba13-1f0db8c3ba39"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.467000 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "07a9584a-a546-4ec3-ba13-1f0db8c3ba39" (UID: "07a9584a-a546-4ec3-ba13-1f0db8c3ba39"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.483837 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-inventory" (OuterVolumeSpecName: "inventory") pod "07a9584a-a546-4ec3-ba13-1f0db8c3ba39" (UID: "07a9584a-a546-4ec3-ba13-1f0db8c3ba39"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.487752 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "07a9584a-a546-4ec3-ba13-1f0db8c3ba39" (UID: "07a9584a-a546-4ec3-ba13-1f0db8c3ba39"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.550367 4681 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.550399 4681 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.550409 4681 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.550418 4681 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.550431 4681 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.550439 4681 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.550448 4681 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.550459 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4z5s\" (UniqueName: \"kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-kube-api-access-m4z5s\") on node \"crc\" DevicePath \"\"" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.550467 4681 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.550476 4681 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.550485 4681 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.550493 4681 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-openstack-edpm-ipam-libvirt-default-certs-0\") on node 
\"crc\" DevicePath \"\"" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.550505 4681 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.550514 4681 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a9584a-a546-4ec3-ba13-1f0db8c3ba39-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.867597 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" event={"ID":"07a9584a-a546-4ec3-ba13-1f0db8c3ba39","Type":"ContainerDied","Data":"09cdcc3b586131bbf7a3f8f4152a92f706e7f3ed78a0d9b96fab537d77188343"} Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.867639 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09cdcc3b586131bbf7a3f8f4152a92f706e7f3ed78a0d9b96fab537d77188343" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.867703 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.977543 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz"] Oct 07 17:40:03 crc kubenswrapper[4681]: E1007 17:40:03.978456 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a9584a-a546-4ec3-ba13-1f0db8c3ba39" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.978481 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a9584a-a546-4ec3-ba13-1f0db8c3ba39" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.978924 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="07a9584a-a546-4ec3-ba13-1f0db8c3ba39" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.979852 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.981918 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.981981 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.981981 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.982545 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vtl6" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.982922 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 17:40:03 crc kubenswrapper[4681]: I1007 17:40:03.994361 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz"] Oct 07 17:40:04 crc kubenswrapper[4681]: I1007 17:40:04.057935 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wd6mz\" (UID: \"eb92dd00-8b97-470f-9f2c-3ff1ee783f93\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz" Oct 07 17:40:04 crc kubenswrapper[4681]: I1007 17:40:04.058267 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlq64\" (UniqueName: \"kubernetes.io/projected/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-kube-api-access-hlq64\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wd6mz\" (UID: \"eb92dd00-8b97-470f-9f2c-3ff1ee783f93\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz" Oct 07 17:40:04 crc kubenswrapper[4681]: I1007 17:40:04.058505 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wd6mz\" (UID: \"eb92dd00-8b97-470f-9f2c-3ff1ee783f93\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz" Oct 07 17:40:04 crc kubenswrapper[4681]: I1007 17:40:04.058674 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wd6mz\" (UID: \"eb92dd00-8b97-470f-9f2c-3ff1ee783f93\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz" Oct 07 17:40:04 crc kubenswrapper[4681]: I1007 17:40:04.058846 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wd6mz\" (UID: \"eb92dd00-8b97-470f-9f2c-3ff1ee783f93\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz" Oct 07 17:40:04 crc kubenswrapper[4681]: I1007 17:40:04.160630 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlq64\" 
(UniqueName: \"kubernetes.io/projected/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-kube-api-access-hlq64\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wd6mz\" (UID: \"eb92dd00-8b97-470f-9f2c-3ff1ee783f93\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz" Oct 07 17:40:04 crc kubenswrapper[4681]: I1007 17:40:04.161033 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wd6mz\" (UID: \"eb92dd00-8b97-470f-9f2c-3ff1ee783f93\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz" Oct 07 17:40:04 crc kubenswrapper[4681]: I1007 17:40:04.161198 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wd6mz\" (UID: \"eb92dd00-8b97-470f-9f2c-3ff1ee783f93\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz" Oct 07 17:40:04 crc kubenswrapper[4681]: I1007 17:40:04.161369 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wd6mz\" (UID: \"eb92dd00-8b97-470f-9f2c-3ff1ee783f93\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz" Oct 07 17:40:04 crc kubenswrapper[4681]: I1007 17:40:04.161552 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wd6mz\" (UID: \"eb92dd00-8b97-470f-9f2c-3ff1ee783f93\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz" Oct 07 17:40:04 crc kubenswrapper[4681]: I1007 17:40:04.161947 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wd6mz\" (UID: \"eb92dd00-8b97-470f-9f2c-3ff1ee783f93\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz" Oct 07 17:40:04 crc kubenswrapper[4681]: I1007 17:40:04.164713 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wd6mz\" (UID: \"eb92dd00-8b97-470f-9f2c-3ff1ee783f93\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz" Oct 07 17:40:04 crc kubenswrapper[4681]: I1007 17:40:04.166148 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wd6mz\" (UID: \"eb92dd00-8b97-470f-9f2c-3ff1ee783f93\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz" Oct 07 17:40:04 crc kubenswrapper[4681]: I1007 17:40:04.168644 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wd6mz\" (UID: \"eb92dd00-8b97-470f-9f2c-3ff1ee783f93\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz" Oct 07 17:40:04 crc kubenswrapper[4681]: I1007 17:40:04.180318 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlq64\" (UniqueName: \"kubernetes.io/projected/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-kube-api-access-hlq64\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wd6mz\" (UID: \"eb92dd00-8b97-470f-9f2c-3ff1ee783f93\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz" Oct 07 17:40:04 crc kubenswrapper[4681]: I1007 17:40:04.297396 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz" Oct 07 17:40:04 crc kubenswrapper[4681]: I1007 17:40:04.777106 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz"] Oct 07 17:40:04 crc kubenswrapper[4681]: I1007 17:40:04.877839 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz" event={"ID":"eb92dd00-8b97-470f-9f2c-3ff1ee783f93","Type":"ContainerStarted","Data":"a0c3e8a6427ccdfe0962d29f0b704ee40e6ad4737f3afe66228adcae3ef2bc4c"} Oct 07 17:40:05 crc kubenswrapper[4681]: I1007 17:40:05.897216 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz" event={"ID":"eb92dd00-8b97-470f-9f2c-3ff1ee783f93","Type":"ContainerStarted","Data":"380ca56f5132d69f9cfd570a8de9c7a0779591aecc0bb6eb62ac7f9cf94dbdda"} Oct 07 17:40:05 crc kubenswrapper[4681]: I1007 17:40:05.927693 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz" podStartSLOduration=2.751373907 podStartE2EDuration="2.927672665s" podCreationTimestamp="2025-10-07 17:40:03 +0000 UTC" firstStartedPulling="2025-10-07 17:40:04.786782916 +0000 UTC m=+2208.434194471" lastFinishedPulling="2025-10-07 17:40:04.963081674 +0000 UTC m=+2208.610493229" observedRunningTime="2025-10-07 17:40:05.914003635 +0000 UTC m=+2209.561415210" watchObservedRunningTime="2025-10-07 17:40:05.927672665 +0000 UTC m=+2209.575084220" Oct 07 17:40:12 crc kubenswrapper[4681]: I1007 17:40:12.195351 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 17:40:12 crc kubenswrapper[4681]: I1007 17:40:12.195852 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 17:40:12 crc kubenswrapper[4681]: I1007 17:40:12.195925 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" Oct 07 17:40:12 crc kubenswrapper[4681]: I1007 17:40:12.196573 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11"} pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" containerMessage="Container machine-config-daemon failed liveness probe, will 
be restarted" Oct 07 17:40:12 crc kubenswrapper[4681]: I1007 17:40:12.196617 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" containerID="cri-o://0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" gracePeriod=600 Oct 07 17:40:12 crc kubenswrapper[4681]: E1007 17:40:12.321755 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:40:12 crc kubenswrapper[4681]: I1007 17:40:12.950299 4681 generic.go:334] "Generic (PLEG): container finished" podID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" exitCode=0 Oct 07 17:40:12 crc kubenswrapper[4681]: I1007 17:40:12.950341 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerDied","Data":"0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11"} Oct 07 17:40:12 crc kubenswrapper[4681]: I1007 17:40:12.950372 4681 scope.go:117] "RemoveContainer" containerID="ad4b0c013eb4193912c036e13ab105cfb4c4e355d6478ef69c7f9e2f52056767" Oct 07 17:40:12 crc kubenswrapper[4681]: I1007 17:40:12.951098 4681 scope.go:117] "RemoveContainer" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" Oct 07 17:40:12 crc kubenswrapper[4681]: E1007 17:40:12.951599 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:40:24 crc kubenswrapper[4681]: I1007 17:40:24.029094 4681 scope.go:117] "RemoveContainer" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" Oct 07 17:40:24 crc kubenswrapper[4681]: E1007 17:40:24.029802 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:40:35 crc kubenswrapper[4681]: I1007 17:40:35.029912 4681 scope.go:117] "RemoveContainer" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" Oct 07 17:40:35 crc kubenswrapper[4681]: E1007 17:40:35.030675 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:40:46 crc kubenswrapper[4681]: I1007 17:40:46.029347 4681 scope.go:117] "RemoveContainer" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" Oct 07 17:40:46 crc kubenswrapper[4681]: E1007 17:40:46.030435 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:40:59 crc kubenswrapper[4681]: I1007 17:40:59.030583 4681 scope.go:117] "RemoveContainer" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" Oct 07 17:40:59 crc kubenswrapper[4681]: E1007 17:40:59.032812 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:41:11 crc kubenswrapper[4681]: I1007 17:41:11.030423 4681 scope.go:117] "RemoveContainer" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" Oct 07 17:41:11 crc kubenswrapper[4681]: E1007 17:41:11.031305 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:41:17 crc kubenswrapper[4681]: I1007 17:41:17.516133 4681 generic.go:334] "Generic (PLEG): container finished" podID="eb92dd00-8b97-470f-9f2c-3ff1ee783f93" containerID="380ca56f5132d69f9cfd570a8de9c7a0779591aecc0bb6eb62ac7f9cf94dbdda" exitCode=0 Oct 07 17:41:17 crc kubenswrapper[4681]: I1007 17:41:17.516750 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz" event={"ID":"eb92dd00-8b97-470f-9f2c-3ff1ee783f93","Type":"ContainerDied","Data":"380ca56f5132d69f9cfd570a8de9c7a0779591aecc0bb6eb62ac7f9cf94dbdda"} Oct 07 17:41:18 crc kubenswrapper[4681]: I1007 17:41:18.903187 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.002679 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlq64\" (UniqueName: \"kubernetes.io/projected/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-kube-api-access-hlq64\") pod \"eb92dd00-8b97-470f-9f2c-3ff1ee783f93\" (UID: \"eb92dd00-8b97-470f-9f2c-3ff1ee783f93\") " Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.002745 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-ssh-key\") pod \"eb92dd00-8b97-470f-9f2c-3ff1ee783f93\" (UID: \"eb92dd00-8b97-470f-9f2c-3ff1ee783f93\") " Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.002828 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-inventory\") pod \"eb92dd00-8b97-470f-9f2c-3ff1ee783f93\" (UID: \"eb92dd00-8b97-470f-9f2c-3ff1ee783f93\") " Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.002935 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-ovn-combined-ca-bundle\") pod \"eb92dd00-8b97-470f-9f2c-3ff1ee783f93\" (UID: \"eb92dd00-8b97-470f-9f2c-3ff1ee783f93\") " Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.003006 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-ovncontroller-config-0\") pod \"eb92dd00-8b97-470f-9f2c-3ff1ee783f93\" (UID: \"eb92dd00-8b97-470f-9f2c-3ff1ee783f93\") " Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.009547 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-kube-api-access-hlq64" (OuterVolumeSpecName: "kube-api-access-hlq64") pod "eb92dd00-8b97-470f-9f2c-3ff1ee783f93" (UID: "eb92dd00-8b97-470f-9f2c-3ff1ee783f93"). InnerVolumeSpecName "kube-api-access-hlq64". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.010966 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "eb92dd00-8b97-470f-9f2c-3ff1ee783f93" (UID: "eb92dd00-8b97-470f-9f2c-3ff1ee783f93"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.034016 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "eb92dd00-8b97-470f-9f2c-3ff1ee783f93" (UID: "eb92dd00-8b97-470f-9f2c-3ff1ee783f93"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.042193 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-inventory" (OuterVolumeSpecName: "inventory") pod "eb92dd00-8b97-470f-9f2c-3ff1ee783f93" (UID: "eb92dd00-8b97-470f-9f2c-3ff1ee783f93"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.046404 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "eb92dd00-8b97-470f-9f2c-3ff1ee783f93" (UID: "eb92dd00-8b97-470f-9f2c-3ff1ee783f93"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.105427 4681 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.105468 4681 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.105481 4681 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.105493 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlq64\" (UniqueName: \"kubernetes.io/projected/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-kube-api-access-hlq64\") on node \"crc\" DevicePath \"\"" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.105504 4681 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb92dd00-8b97-470f-9f2c-3ff1ee783f93-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.533894 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz" event={"ID":"eb92dd00-8b97-470f-9f2c-3ff1ee783f93","Type":"ContainerDied","Data":"a0c3e8a6427ccdfe0962d29f0b704ee40e6ad4737f3afe66228adcae3ef2bc4c"} Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.534221 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0c3e8a6427ccdfe0962d29f0b704ee40e6ad4737f3afe66228adcae3ef2bc4c" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.533960 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wd6mz" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.636821 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw"] Oct 07 17:41:19 crc kubenswrapper[4681]: E1007 17:41:19.637288 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb92dd00-8b97-470f-9f2c-3ff1ee783f93" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.637311 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb92dd00-8b97-470f-9f2c-3ff1ee783f93" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.637540 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb92dd00-8b97-470f-9f2c-3ff1ee783f93" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.638241 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.640393 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.640802 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vtl6" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.646351 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.646366 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.646520 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.648388 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.650235 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw"] Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.819124 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw\" (UID: \"d1f9c32e-011c-49a9-8319-4aeb852fa976\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.819528 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw\" (UID: \"d1f9c32e-011c-49a9-8319-4aeb852fa976\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.819566 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw\" (UID: \"d1f9c32e-011c-49a9-8319-4aeb852fa976\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.819734 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw\" (UID: \"d1f9c32e-011c-49a9-8319-4aeb852fa976\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.819971 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw\" (UID: \"d1f9c32e-011c-49a9-8319-4aeb852fa976\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.820007 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6p7v\" (UniqueName: \"kubernetes.io/projected/d1f9c32e-011c-49a9-8319-4aeb852fa976-kube-api-access-h6p7v\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw\" (UID: \"d1f9c32e-011c-49a9-8319-4aeb852fa976\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.921777 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw\" (UID: \"d1f9c32e-011c-49a9-8319-4aeb852fa976\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.921826 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw\" (UID: \"d1f9c32e-011c-49a9-8319-4aeb852fa976\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.921893 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw\" (UID: \"d1f9c32e-011c-49a9-8319-4aeb852fa976\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.921978 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw\" (UID: \"d1f9c32e-011c-49a9-8319-4aeb852fa976\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.922003 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6p7v\" (UniqueName: \"kubernetes.io/projected/d1f9c32e-011c-49a9-8319-4aeb852fa976-kube-api-access-h6p7v\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw\" (UID: \"d1f9c32e-011c-49a9-8319-4aeb852fa976\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.922061 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw\" (UID: \"d1f9c32e-011c-49a9-8319-4aeb852fa976\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.927105 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw\" (UID: \"d1f9c32e-011c-49a9-8319-4aeb852fa976\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.927978 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw\" (UID: \"d1f9c32e-011c-49a9-8319-4aeb852fa976\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.928707 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw\" (UID: \"d1f9c32e-011c-49a9-8319-4aeb852fa976\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.930586 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw\" (UID: \"d1f9c32e-011c-49a9-8319-4aeb852fa976\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.935584 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw\" (UID: \"d1f9c32e-011c-49a9-8319-4aeb852fa976\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.938049 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6p7v\" 
(UniqueName: \"kubernetes.io/projected/d1f9c32e-011c-49a9-8319-4aeb852fa976-kube-api-access-h6p7v\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw\" (UID: \"d1f9c32e-011c-49a9-8319-4aeb852fa976\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw" Oct 07 17:41:19 crc kubenswrapper[4681]: I1007 17:41:19.959538 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw" Oct 07 17:41:20 crc kubenswrapper[4681]: I1007 17:41:20.464082 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw"] Oct 07 17:41:20 crc kubenswrapper[4681]: I1007 17:41:20.542582 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw" event={"ID":"d1f9c32e-011c-49a9-8319-4aeb852fa976","Type":"ContainerStarted","Data":"773c1c94856d46aab8fe678085eec616f6803bd32192b19397f3e7325131667c"} Oct 07 17:41:21 crc kubenswrapper[4681]: I1007 17:41:21.552240 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw" event={"ID":"d1f9c32e-011c-49a9-8319-4aeb852fa976","Type":"ContainerStarted","Data":"20db18cc4e25328c8da4af47ef65a83f2d45d980c8a0fb58c176c451d946283e"} Oct 07 17:41:21 crc kubenswrapper[4681]: I1007 17:41:21.575925 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw" podStartSLOduration=2.373580884 podStartE2EDuration="2.575906365s" podCreationTimestamp="2025-10-07 17:41:19 +0000 UTC" firstStartedPulling="2025-10-07 17:41:20.475214069 +0000 UTC m=+2284.122625634" lastFinishedPulling="2025-10-07 17:41:20.67753955 +0000 UTC m=+2284.324951115" observedRunningTime="2025-10-07 17:41:21.572332498 +0000 UTC m=+2285.219744063" watchObservedRunningTime="2025-10-07 17:41:21.575906365 +0000 UTC m=+2285.223317920" Oct 07 17:41:25 crc kubenswrapper[4681]: I1007 17:41:25.029372 4681 scope.go:117] "RemoveContainer" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" Oct 07 17:41:25 crc kubenswrapper[4681]: E1007 17:41:25.029939 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:41:37 crc kubenswrapper[4681]: I1007 17:41:37.033773 4681 scope.go:117] "RemoveContainer" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" Oct 07 17:41:37 crc kubenswrapper[4681]: E1007 17:41:37.034438 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:41:51 crc kubenswrapper[4681]: I1007 17:41:51.029043 4681 scope.go:117] "RemoveContainer" 
containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" Oct 07 17:41:51 crc kubenswrapper[4681]: E1007 17:41:51.029907 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:42:00 crc kubenswrapper[4681]: I1007 17:42:00.259420 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9qwkx"] Oct 07 17:42:00 crc kubenswrapper[4681]: I1007 17:42:00.262839 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9qwkx" Oct 07 17:42:00 crc kubenswrapper[4681]: I1007 17:42:00.273465 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qwkx"] Oct 07 17:42:00 crc kubenswrapper[4681]: I1007 17:42:00.356210 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f9cb707-3dfd-4e5b-ad5b-e2b500f23330-catalog-content\") pod \"redhat-marketplace-9qwkx\" (UID: \"9f9cb707-3dfd-4e5b-ad5b-e2b500f23330\") " pod="openshift-marketplace/redhat-marketplace-9qwkx" Oct 07 17:42:00 crc kubenswrapper[4681]: I1007 17:42:00.356280 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f9cb707-3dfd-4e5b-ad5b-e2b500f23330-utilities\") pod \"redhat-marketplace-9qwkx\" (UID: \"9f9cb707-3dfd-4e5b-ad5b-e2b500f23330\") " pod="openshift-marketplace/redhat-marketplace-9qwkx" Oct 07 17:42:00 crc kubenswrapper[4681]: I1007 17:42:00.356336 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7d65\" (UniqueName: \"kubernetes.io/projected/9f9cb707-3dfd-4e5b-ad5b-e2b500f23330-kube-api-access-d7d65\") pod \"redhat-marketplace-9qwkx\" (UID: \"9f9cb707-3dfd-4e5b-ad5b-e2b500f23330\") " pod="openshift-marketplace/redhat-marketplace-9qwkx" Oct 07 17:42:00 crc kubenswrapper[4681]: I1007 17:42:00.458148 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f9cb707-3dfd-4e5b-ad5b-e2b500f23330-catalog-content\") pod \"redhat-marketplace-9qwkx\" (UID: \"9f9cb707-3dfd-4e5b-ad5b-e2b500f23330\") " pod="openshift-marketplace/redhat-marketplace-9qwkx" Oct 07 17:42:00 crc kubenswrapper[4681]: I1007 17:42:00.458237 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f9cb707-3dfd-4e5b-ad5b-e2b500f23330-utilities\") pod \"redhat-marketplace-9qwkx\" (UID: \"9f9cb707-3dfd-4e5b-ad5b-e2b500f23330\") " pod="openshift-marketplace/redhat-marketplace-9qwkx" Oct 07 17:42:00 crc kubenswrapper[4681]: I1007 17:42:00.458282 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7d65\" (UniqueName: \"kubernetes.io/projected/9f9cb707-3dfd-4e5b-ad5b-e2b500f23330-kube-api-access-d7d65\") pod \"redhat-marketplace-9qwkx\" (UID: \"9f9cb707-3dfd-4e5b-ad5b-e2b500f23330\") " pod="openshift-marketplace/redhat-marketplace-9qwkx" Oct 07 17:42:00 crc 
Oct 07 17:42:00 crc kubenswrapper[4681]: I1007 17:42:00.458766 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f9cb707-3dfd-4e5b-ad5b-e2b500f23330-utilities\") pod \"redhat-marketplace-9qwkx\" (UID: \"9f9cb707-3dfd-4e5b-ad5b-e2b500f23330\") " pod="openshift-marketplace/redhat-marketplace-9qwkx"
Oct 07 17:42:00 crc kubenswrapper[4681]: I1007 17:42:00.459023 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f9cb707-3dfd-4e5b-ad5b-e2b500f23330-catalog-content\") pod \"redhat-marketplace-9qwkx\" (UID: \"9f9cb707-3dfd-4e5b-ad5b-e2b500f23330\") " pod="openshift-marketplace/redhat-marketplace-9qwkx"
Oct 07 17:42:00 crc kubenswrapper[4681]: I1007 17:42:00.487069 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7d65\" (UniqueName: \"kubernetes.io/projected/9f9cb707-3dfd-4e5b-ad5b-e2b500f23330-kube-api-access-d7d65\") pod \"redhat-marketplace-9qwkx\" (UID: \"9f9cb707-3dfd-4e5b-ad5b-e2b500f23330\") " pod="openshift-marketplace/redhat-marketplace-9qwkx"
Oct 07 17:42:00 crc kubenswrapper[4681]: I1007 17:42:00.587151 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9qwkx"
Oct 07 17:42:01 crc kubenswrapper[4681]: I1007 17:42:01.074694 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qwkx"]
Oct 07 17:42:01 crc kubenswrapper[4681]: I1007 17:42:01.939912 4681 generic.go:334] "Generic (PLEG): container finished" podID="9f9cb707-3dfd-4e5b-ad5b-e2b500f23330" containerID="e46ec4fea5b68747303c99b6dd92c8628de7c99da229802a74e013267ffc4f8c" exitCode=0
Oct 07 17:42:01 crc kubenswrapper[4681]: I1007 17:42:01.940027 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qwkx" event={"ID":"9f9cb707-3dfd-4e5b-ad5b-e2b500f23330","Type":"ContainerDied","Data":"e46ec4fea5b68747303c99b6dd92c8628de7c99da229802a74e013267ffc4f8c"}
Oct 07 17:42:01 crc kubenswrapper[4681]: I1007 17:42:01.940404 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qwkx" event={"ID":"9f9cb707-3dfd-4e5b-ad5b-e2b500f23330","Type":"ContainerStarted","Data":"07ffd1122608ca2dbb2df242422a30e0d3f005ba477d0e25c658e3dd6827e0f2"}
Oct 07 17:42:03 crc kubenswrapper[4681]: I1007 17:42:03.958869 4681 generic.go:334] "Generic (PLEG): container finished" podID="9f9cb707-3dfd-4e5b-ad5b-e2b500f23330" containerID="a956660b6e4ea9053a2891604b5e27777fc3df3c4c588f61ca9b5a10918f8ebf" exitCode=0
Oct 07 17:42:03 crc kubenswrapper[4681]: I1007 17:42:03.958954 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qwkx" event={"ID":"9f9cb707-3dfd-4e5b-ad5b-e2b500f23330","Type":"ContainerDied","Data":"a956660b6e4ea9053a2891604b5e27777fc3df3c4c588f61ca9b5a10918f8ebf"}
Oct 07 17:42:05 crc kubenswrapper[4681]: I1007 17:42:05.982139 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qwkx" event={"ID":"9f9cb707-3dfd-4e5b-ad5b-e2b500f23330","Type":"ContainerStarted","Data":"acf87f8a48c4325803cb623c6da0c5ee7508549af8e8c953c6760f4149250577"}
Oct 07 17:42:06 crc kubenswrapper[4681]: I1007 17:42:06.003781 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9qwkx" podStartSLOduration=3.104039378 podStartE2EDuration="6.00376023s" podCreationTimestamp="2025-10-07 17:42:00 +0000 UTC" firstStartedPulling="2025-10-07 17:42:01.941765815 +0000 UTC m=+2325.589177370" lastFinishedPulling="2025-10-07 17:42:04.841486657 +0000 UTC m=+2328.488898222" observedRunningTime="2025-10-07 17:42:05.999915236 +0000 UTC m=+2329.647326811" watchObservedRunningTime="2025-10-07 17:42:06.00376023 +0000 UTC m=+2329.651171785"
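The two durations in each "Observed pod startup duration" entry are related: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that same span with the image-pull window (lastFinishedPulling minus firstStartedPulling) subtracted. Both latency entries in this log check out: the earlier neutron-metadata one gives 2.575906365s - 0.202325481s = 2.373580884, exactly as reported, and the entry above gives 6.00376023s - 2.899720842s ≈ 3.104039, matching podStartSLOduration up to clock-source rounding. A sketch of the subtraction, assuming those semantics, with the values copied from the entry above:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Values copied from the redhat-marketplace-9qwkx latency entry above.
	created, _ := time.Parse(time.RFC3339Nano, "2025-10-07T17:42:00Z")
	observedRunning, _ := time.Parse(time.RFC3339Nano, "2025-10-07T17:42:06.00376023Z")
	startedPulling, _ := time.Parse(time.RFC3339Nano, "2025-10-07T17:42:01.941765815Z")
	finishedPulling, _ := time.Parse(time.RFC3339Nano, "2025-10-07T17:42:04.841486657Z")

	e2e := observedRunning.Sub(created)              // podStartE2EDuration ≈ 6.00376023s
	slo := e2e - finishedPulling.Sub(startedPulling) // podStartSLOduration ≈ 3.104s, pull time excluded
	fmt.Println(e2e, slo)
}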
podStartE2EDuration="6.00376023s" podCreationTimestamp="2025-10-07 17:42:00 +0000 UTC" firstStartedPulling="2025-10-07 17:42:01.941765815 +0000 UTC m=+2325.589177370" lastFinishedPulling="2025-10-07 17:42:04.841486657 +0000 UTC m=+2328.488898222" observedRunningTime="2025-10-07 17:42:05.999915236 +0000 UTC m=+2329.647326811" watchObservedRunningTime="2025-10-07 17:42:06.00376023 +0000 UTC m=+2329.651171785" Oct 07 17:42:06 crc kubenswrapper[4681]: I1007 17:42:06.028782 4681 scope.go:117] "RemoveContainer" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" Oct 07 17:42:06 crc kubenswrapper[4681]: E1007 17:42:06.029098 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:42:10 crc kubenswrapper[4681]: I1007 17:42:10.587526 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9qwkx" Oct 07 17:42:10 crc kubenswrapper[4681]: I1007 17:42:10.588018 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9qwkx" Oct 07 17:42:10 crc kubenswrapper[4681]: I1007 17:42:10.634261 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9qwkx" Oct 07 17:42:11 crc kubenswrapper[4681]: I1007 17:42:11.077215 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9qwkx" Oct 07 17:42:11 crc kubenswrapper[4681]: I1007 17:42:11.130123 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qwkx"] Oct 07 17:42:13 crc kubenswrapper[4681]: I1007 17:42:13.038929 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9qwkx" podUID="9f9cb707-3dfd-4e5b-ad5b-e2b500f23330" containerName="registry-server" containerID="cri-o://acf87f8a48c4325803cb623c6da0c5ee7508549af8e8c953c6760f4149250577" gracePeriod=2 Oct 07 17:42:13 crc kubenswrapper[4681]: I1007 17:42:13.527402 4681 util.go:48] "No ready sandbox for pod can be found. 
Oct 07 17:42:13 crc kubenswrapper[4681]: I1007 17:42:13.629630 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f9cb707-3dfd-4e5b-ad5b-e2b500f23330-catalog-content\") pod \"9f9cb707-3dfd-4e5b-ad5b-e2b500f23330\" (UID: \"9f9cb707-3dfd-4e5b-ad5b-e2b500f23330\") "
Oct 07 17:42:13 crc kubenswrapper[4681]: I1007 17:42:13.629710 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f9cb707-3dfd-4e5b-ad5b-e2b500f23330-utilities\") pod \"9f9cb707-3dfd-4e5b-ad5b-e2b500f23330\" (UID: \"9f9cb707-3dfd-4e5b-ad5b-e2b500f23330\") "
Oct 07 17:42:13 crc kubenswrapper[4681]: I1007 17:42:13.629758 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7d65\" (UniqueName: \"kubernetes.io/projected/9f9cb707-3dfd-4e5b-ad5b-e2b500f23330-kube-api-access-d7d65\") pod \"9f9cb707-3dfd-4e5b-ad5b-e2b500f23330\" (UID: \"9f9cb707-3dfd-4e5b-ad5b-e2b500f23330\") "
Oct 07 17:42:13 crc kubenswrapper[4681]: I1007 17:42:13.632053 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f9cb707-3dfd-4e5b-ad5b-e2b500f23330-utilities" (OuterVolumeSpecName: "utilities") pod "9f9cb707-3dfd-4e5b-ad5b-e2b500f23330" (UID: "9f9cb707-3dfd-4e5b-ad5b-e2b500f23330"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 17:42:13 crc kubenswrapper[4681]: I1007 17:42:13.643293 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f9cb707-3dfd-4e5b-ad5b-e2b500f23330-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f9cb707-3dfd-4e5b-ad5b-e2b500f23330" (UID: "9f9cb707-3dfd-4e5b-ad5b-e2b500f23330"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 17:42:13 crc kubenswrapper[4681]: I1007 17:42:13.644761 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f9cb707-3dfd-4e5b-ad5b-e2b500f23330-kube-api-access-d7d65" (OuterVolumeSpecName: "kube-api-access-d7d65") pod "9f9cb707-3dfd-4e5b-ad5b-e2b500f23330" (UID: "9f9cb707-3dfd-4e5b-ad5b-e2b500f23330"). InnerVolumeSpecName "kube-api-access-d7d65". PluginName "kubernetes.io/projected", VolumeGidValue ""
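The teardown above is the mount path in reverse: "UnmountVolume started" for each volume, a per-volume TearDown, and only once all TearDowns succeed do the "Volume detached" records below appear, letting pod cleanup proceed. A sketch of that fan-out/fan-in ordering (illustrative only; the real reconciler serializes through its actual-state-of-world cache rather than a WaitGroup):

package main

import (
	"fmt"
	"sync"
)

func main() {
	volumes := []string{"catalog-content", "utilities", "kube-api-access-d7d65"}

	var wg sync.WaitGroup
	for _, v := range volumes {
		wg.Add(1)
		go func(v string) { // one TearDown per volume, as in the entries above
			defer wg.Done()
			fmt.Printf("UnmountVolume.TearDown succeeded for volume %q\n", v)
		}(v)
	}
	wg.Wait() // everything torn down before the pod's volumes dir may be removed

	for _, v := range volumes {
		fmt.Printf("Volume detached for volume %q\n", v)
	}
}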
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:42:13 crc kubenswrapper[4681]: I1007 17:42:13.732231 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f9cb707-3dfd-4e5b-ad5b-e2b500f23330-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 17:42:13 crc kubenswrapper[4681]: I1007 17:42:13.732459 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f9cb707-3dfd-4e5b-ad5b-e2b500f23330-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 17:42:13 crc kubenswrapper[4681]: I1007 17:42:13.732536 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7d65\" (UniqueName: \"kubernetes.io/projected/9f9cb707-3dfd-4e5b-ad5b-e2b500f23330-kube-api-access-d7d65\") on node \"crc\" DevicePath \"\"" Oct 07 17:42:14 crc kubenswrapper[4681]: I1007 17:42:14.163952 4681 generic.go:334] "Generic (PLEG): container finished" podID="9f9cb707-3dfd-4e5b-ad5b-e2b500f23330" containerID="acf87f8a48c4325803cb623c6da0c5ee7508549af8e8c953c6760f4149250577" exitCode=0 Oct 07 17:42:14 crc kubenswrapper[4681]: I1007 17:42:14.164219 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qwkx" event={"ID":"9f9cb707-3dfd-4e5b-ad5b-e2b500f23330","Type":"ContainerDied","Data":"acf87f8a48c4325803cb623c6da0c5ee7508549af8e8c953c6760f4149250577"} Oct 07 17:42:14 crc kubenswrapper[4681]: I1007 17:42:14.164246 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qwkx" event={"ID":"9f9cb707-3dfd-4e5b-ad5b-e2b500f23330","Type":"ContainerDied","Data":"07ffd1122608ca2dbb2df242422a30e0d3f005ba477d0e25c658e3dd6827e0f2"} Oct 07 17:42:14 crc kubenswrapper[4681]: I1007 17:42:14.164278 4681 scope.go:117] "RemoveContainer" containerID="acf87f8a48c4325803cb623c6da0c5ee7508549af8e8c953c6760f4149250577" Oct 07 17:42:14 crc kubenswrapper[4681]: I1007 17:42:14.164435 4681 util.go:48] "No ready sandbox for pod can be found. 
Oct 07 17:42:14 crc kubenswrapper[4681]: I1007 17:42:14.194790 4681 scope.go:117] "RemoveContainer" containerID="a956660b6e4ea9053a2891604b5e27777fc3df3c4c588f61ca9b5a10918f8ebf"
Oct 07 17:42:14 crc kubenswrapper[4681]: I1007 17:42:14.204185 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qwkx"]
Oct 07 17:42:14 crc kubenswrapper[4681]: I1007 17:42:14.221734 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qwkx"]
Oct 07 17:42:14 crc kubenswrapper[4681]: I1007 17:42:14.237547 4681 scope.go:117] "RemoveContainer" containerID="e46ec4fea5b68747303c99b6dd92c8628de7c99da229802a74e013267ffc4f8c"
Oct 07 17:42:14 crc kubenswrapper[4681]: I1007 17:42:14.264635 4681 scope.go:117] "RemoveContainer" containerID="acf87f8a48c4325803cb623c6da0c5ee7508549af8e8c953c6760f4149250577"
Oct 07 17:42:14 crc kubenswrapper[4681]: E1007 17:42:14.265102 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acf87f8a48c4325803cb623c6da0c5ee7508549af8e8c953c6760f4149250577\": container with ID starting with acf87f8a48c4325803cb623c6da0c5ee7508549af8e8c953c6760f4149250577 not found: ID does not exist" containerID="acf87f8a48c4325803cb623c6da0c5ee7508549af8e8c953c6760f4149250577"
Oct 07 17:42:14 crc kubenswrapper[4681]: I1007 17:42:14.265157 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acf87f8a48c4325803cb623c6da0c5ee7508549af8e8c953c6760f4149250577"} err="failed to get container status \"acf87f8a48c4325803cb623c6da0c5ee7508549af8e8c953c6760f4149250577\": rpc error: code = NotFound desc = could not find container \"acf87f8a48c4325803cb623c6da0c5ee7508549af8e8c953c6760f4149250577\": container with ID starting with acf87f8a48c4325803cb623c6da0c5ee7508549af8e8c953c6760f4149250577 not found: ID does not exist"
Oct 07 17:42:14 crc kubenswrapper[4681]: I1007 17:42:14.265185 4681 scope.go:117] "RemoveContainer" containerID="a956660b6e4ea9053a2891604b5e27777fc3df3c4c588f61ca9b5a10918f8ebf"
Oct 07 17:42:14 crc kubenswrapper[4681]: E1007 17:42:14.265451 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a956660b6e4ea9053a2891604b5e27777fc3df3c4c588f61ca9b5a10918f8ebf\": container with ID starting with a956660b6e4ea9053a2891604b5e27777fc3df3c4c588f61ca9b5a10918f8ebf not found: ID does not exist" containerID="a956660b6e4ea9053a2891604b5e27777fc3df3c4c588f61ca9b5a10918f8ebf"
Oct 07 17:42:14 crc kubenswrapper[4681]: I1007 17:42:14.265513 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a956660b6e4ea9053a2891604b5e27777fc3df3c4c588f61ca9b5a10918f8ebf"} err="failed to get container status \"a956660b6e4ea9053a2891604b5e27777fc3df3c4c588f61ca9b5a10918f8ebf\": rpc error: code = NotFound desc = could not find container \"a956660b6e4ea9053a2891604b5e27777fc3df3c4c588f61ca9b5a10918f8ebf\": container with ID starting with a956660b6e4ea9053a2891604b5e27777fc3df3c4c588f61ca9b5a10918f8ebf not found: ID does not exist"
Oct 07 17:42:14 crc kubenswrapper[4681]: I1007 17:42:14.265534 4681 scope.go:117] "RemoveContainer" containerID="e46ec4fea5b68747303c99b6dd92c8628de7c99da229802a74e013267ffc4f8c"
Oct 07 17:42:14 crc kubenswrapper[4681]: E1007 17:42:14.265745 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e46ec4fea5b68747303c99b6dd92c8628de7c99da229802a74e013267ffc4f8c\": container with ID starting with e46ec4fea5b68747303c99b6dd92c8628de7c99da229802a74e013267ffc4f8c not found: ID does not exist" containerID="e46ec4fea5b68747303c99b6dd92c8628de7c99da229802a74e013267ffc4f8c"
failed" err="rpc error: code = NotFound desc = could not find container \"e46ec4fea5b68747303c99b6dd92c8628de7c99da229802a74e013267ffc4f8c\": container with ID starting with e46ec4fea5b68747303c99b6dd92c8628de7c99da229802a74e013267ffc4f8c not found: ID does not exist" containerID="e46ec4fea5b68747303c99b6dd92c8628de7c99da229802a74e013267ffc4f8c" Oct 07 17:42:14 crc kubenswrapper[4681]: I1007 17:42:14.265790 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e46ec4fea5b68747303c99b6dd92c8628de7c99da229802a74e013267ffc4f8c"} err="failed to get container status \"e46ec4fea5b68747303c99b6dd92c8628de7c99da229802a74e013267ffc4f8c\": rpc error: code = NotFound desc = could not find container \"e46ec4fea5b68747303c99b6dd92c8628de7c99da229802a74e013267ffc4f8c\": container with ID starting with e46ec4fea5b68747303c99b6dd92c8628de7c99da229802a74e013267ffc4f8c not found: ID does not exist" Oct 07 17:42:15 crc kubenswrapper[4681]: I1007 17:42:15.041550 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f9cb707-3dfd-4e5b-ad5b-e2b500f23330" path="/var/lib/kubelet/pods/9f9cb707-3dfd-4e5b-ad5b-e2b500f23330/volumes" Oct 07 17:42:17 crc kubenswrapper[4681]: I1007 17:42:17.034772 4681 scope.go:117] "RemoveContainer" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" Oct 07 17:42:17 crc kubenswrapper[4681]: E1007 17:42:17.035326 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:42:18 crc kubenswrapper[4681]: I1007 17:42:18.212957 4681 generic.go:334] "Generic (PLEG): container finished" podID="d1f9c32e-011c-49a9-8319-4aeb852fa976" containerID="20db18cc4e25328c8da4af47ef65a83f2d45d980c8a0fb58c176c451d946283e" exitCode=0 Oct 07 17:42:18 crc kubenswrapper[4681]: I1007 17:42:18.213135 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw" event={"ID":"d1f9c32e-011c-49a9-8319-4aeb852fa976","Type":"ContainerDied","Data":"20db18cc4e25328c8da4af47ef65a83f2d45d980c8a0fb58c176c451d946283e"} Oct 07 17:42:19 crc kubenswrapper[4681]: I1007 17:42:19.653490 4681 util.go:48] "No ready sandbox for pod can be found. 
Oct 07 17:42:19 crc kubenswrapper[4681]: I1007 17:42:19.653490 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw"
Oct 07 17:42:19 crc kubenswrapper[4681]: I1007 17:42:19.660008 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-nova-metadata-neutron-config-0\") pod \"d1f9c32e-011c-49a9-8319-4aeb852fa976\" (UID: \"d1f9c32e-011c-49a9-8319-4aeb852fa976\") "
Oct 07 17:42:19 crc kubenswrapper[4681]: I1007 17:42:19.660111 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-ssh-key\") pod \"d1f9c32e-011c-49a9-8319-4aeb852fa976\" (UID: \"d1f9c32e-011c-49a9-8319-4aeb852fa976\") "
Oct 07 17:42:19 crc kubenswrapper[4681]: I1007 17:42:19.660219 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-neutron-metadata-combined-ca-bundle\") pod \"d1f9c32e-011c-49a9-8319-4aeb852fa976\" (UID: \"d1f9c32e-011c-49a9-8319-4aeb852fa976\") "
Oct 07 17:42:19 crc kubenswrapper[4681]: I1007 17:42:19.660243 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-inventory\") pod \"d1f9c32e-011c-49a9-8319-4aeb852fa976\" (UID: \"d1f9c32e-011c-49a9-8319-4aeb852fa976\") "
Oct 07 17:42:19 crc kubenswrapper[4681]: I1007 17:42:19.660328 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-neutron-ovn-metadata-agent-neutron-config-0\") pod \"d1f9c32e-011c-49a9-8319-4aeb852fa976\" (UID: \"d1f9c32e-011c-49a9-8319-4aeb852fa976\") "
Oct 07 17:42:19 crc kubenswrapper[4681]: I1007 17:42:19.660387 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6p7v\" (UniqueName: \"kubernetes.io/projected/d1f9c32e-011c-49a9-8319-4aeb852fa976-kube-api-access-h6p7v\") pod \"d1f9c32e-011c-49a9-8319-4aeb852fa976\" (UID: \"d1f9c32e-011c-49a9-8319-4aeb852fa976\") "
Oct 07 17:42:19 crc kubenswrapper[4681]: I1007 17:42:19.675085 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1f9c32e-011c-49a9-8319-4aeb852fa976-kube-api-access-h6p7v" (OuterVolumeSpecName: "kube-api-access-h6p7v") pod "d1f9c32e-011c-49a9-8319-4aeb852fa976" (UID: "d1f9c32e-011c-49a9-8319-4aeb852fa976"). InnerVolumeSpecName "kube-api-access-h6p7v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 17:42:19 crc kubenswrapper[4681]: I1007 17:42:19.675561 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "d1f9c32e-011c-49a9-8319-4aeb852fa976" (UID: "d1f9c32e-011c-49a9-8319-4aeb852fa976"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:42:19 crc kubenswrapper[4681]: I1007 17:42:19.715496 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-inventory" (OuterVolumeSpecName: "inventory") pod "d1f9c32e-011c-49a9-8319-4aeb852fa976" (UID: "d1f9c32e-011c-49a9-8319-4aeb852fa976"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:42:19 crc kubenswrapper[4681]: I1007 17:42:19.735088 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "d1f9c32e-011c-49a9-8319-4aeb852fa976" (UID: "d1f9c32e-011c-49a9-8319-4aeb852fa976"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:42:19 crc kubenswrapper[4681]: I1007 17:42:19.735458 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "d1f9c32e-011c-49a9-8319-4aeb852fa976" (UID: "d1f9c32e-011c-49a9-8319-4aeb852fa976"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:42:19 crc kubenswrapper[4681]: I1007 17:42:19.743129 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d1f9c32e-011c-49a9-8319-4aeb852fa976" (UID: "d1f9c32e-011c-49a9-8319-4aeb852fa976"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:42:19 crc kubenswrapper[4681]: I1007 17:42:19.762650 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6p7v\" (UniqueName: \"kubernetes.io/projected/d1f9c32e-011c-49a9-8319-4aeb852fa976-kube-api-access-h6p7v\") on node \"crc\" DevicePath \"\"" Oct 07 17:42:19 crc kubenswrapper[4681]: I1007 17:42:19.762687 4681 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 17:42:19 crc kubenswrapper[4681]: I1007 17:42:19.762698 4681 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 17:42:19 crc kubenswrapper[4681]: I1007 17:42:19.762708 4681 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:42:19 crc kubenswrapper[4681]: I1007 17:42:19.762720 4681 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 17:42:19 crc kubenswrapper[4681]: I1007 17:42:19.762731 4681 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d1f9c32e-011c-49a9-8319-4aeb852fa976-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.232765 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw" event={"ID":"d1f9c32e-011c-49a9-8319-4aeb852fa976","Type":"ContainerDied","Data":"773c1c94856d46aab8fe678085eec616f6803bd32192b19397f3e7325131667c"} Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.233051 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="773c1c94856d46aab8fe678085eec616f6803bd32192b19397f3e7325131667c" Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.232851 4681 util.go:48] "No ready sandbox for pod can be found. 
Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.232851 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw"
Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.350231 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc"]
Oct 07 17:42:20 crc kubenswrapper[4681]: E1007 17:42:20.350829 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f9cb707-3dfd-4e5b-ad5b-e2b500f23330" containerName="extract-utilities"
Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.351582 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f9cb707-3dfd-4e5b-ad5b-e2b500f23330" containerName="extract-utilities"
Oct 07 17:42:20 crc kubenswrapper[4681]: E1007 17:42:20.351673 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1f9c32e-011c-49a9-8319-4aeb852fa976" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.351753 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1f9c32e-011c-49a9-8319-4aeb852fa976" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Oct 07 17:42:20 crc kubenswrapper[4681]: E1007 17:42:20.351817 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f9cb707-3dfd-4e5b-ad5b-e2b500f23330" containerName="registry-server"
Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.351866 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f9cb707-3dfd-4e5b-ad5b-e2b500f23330" containerName="registry-server"
Oct 07 17:42:20 crc kubenswrapper[4681]: E1007 17:42:20.351960 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f9cb707-3dfd-4e5b-ad5b-e2b500f23330" containerName="extract-content"
Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.352017 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f9cb707-3dfd-4e5b-ad5b-e2b500f23330" containerName="extract-content"
Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.352256 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f9cb707-3dfd-4e5b-ad5b-e2b500f23330" containerName="registry-server"
Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.352341 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1f9c32e-011c-49a9-8319-4aeb852fa976" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
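The RemoveStaleState entries above fire as the new libvirt pod is admitted: the CPU and memory managers purge per-container resource assignments that still reference pods the kubelet has since deleted (here the finished marketplace and neutron-metadata pods). In effect it is a map purge keyed by pod UID; a minimal sketch with invented types, not the managers' real state stores:

package main

import "fmt"

// assignments maps podUID -> containerName -> pinned CPU set (illustrative).
type assignments map[string]map[string]string

// removeStaleState drops entries for pods that no longer exist on the node.
func (a assignments) removeStaleState(active map[string]bool) {
	for podUID, containers := range a {
		if active[podUID] {
			continue
		}
		for name := range containers {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n", podUID, name)
		}
		delete(a, podUID)
	}
}

func main() {
	a := assignments{
		"9f9cb707": {"registry-server": "2-3", "extract-content": "1"},
		"3c08afe2": {"libvirt-edpm": "0"},
	}
	a.removeStaleState(map[string]bool{"3c08afe2": true}) // only the new pod survives
	fmt.Println(len(a))                                   // 1
}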
Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.353122 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc"
Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.357963 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.358348 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.358376 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.359786 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vtl6"
Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.362262 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.369308 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc"]
Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.498673 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djlbc\" (UID: \"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc"
Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.498732 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djlbc\" (UID: \"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc"
Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.499450 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lhrg\" (UniqueName: \"kubernetes.io/projected/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-kube-api-access-2lhrg\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djlbc\" (UID: \"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc"
Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.499477 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djlbc\" (UID: \"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc"
Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.499497 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djlbc\" (UID: \"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc"
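The "Caches populated for *v1.Secret ..." lines mark client-go reflectors completing their initial LIST for the secrets and configmaps this pod references; the volume setup that follows reads from those local caches rather than hitting the API server per mount. A small client-go sketch of starting a secret informer and waiting for its cache to sync; the APIs are standard, but the kubeconfig wiring and namespace here are just for the example:

package main

import (
	"context"
	"fmt"
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	// One shared informer factory, scoped to the namespace the pod lives in.
	factory := informers.NewSharedInformerFactoryWithOptions(
		client, 10*time.Minute, informers.WithNamespace("openstack"))
	secrets := factory.Core().V1().Secrets().Informer()

	ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
	defer cancel()
	factory.Start(ctx.Done())

	// The moral equivalent of "Caches populated": block until the initial LIST lands.
	if !cache.WaitForCacheSync(ctx.Done(), secrets.HasSynced) {
		panic("secret cache never synced")
	}
	fmt.Println("secret cache populated")
}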
\"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djlbc\" (UID: \"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc" Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.609274 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djlbc\" (UID: \"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc" Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.609357 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lhrg\" (UniqueName: \"kubernetes.io/projected/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-kube-api-access-2lhrg\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djlbc\" (UID: \"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc" Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.609377 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djlbc\" (UID: \"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc" Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.609400 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djlbc\" (UID: \"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc" Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.614082 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djlbc\" (UID: \"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc" Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.614267 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djlbc\" (UID: \"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc" Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.614579 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djlbc\" (UID: \"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc" Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.615696 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-libvirt-secret-0\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-djlbc\" (UID: \"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc" Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.643217 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lhrg\" (UniqueName: \"kubernetes.io/projected/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-kube-api-access-2lhrg\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-djlbc\" (UID: \"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc" Oct 07 17:42:20 crc kubenswrapper[4681]: I1007 17:42:20.672097 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc" Oct 07 17:42:21 crc kubenswrapper[4681]: I1007 17:42:21.182495 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc"] Oct 07 17:42:21 crc kubenswrapper[4681]: I1007 17:42:21.242622 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc" event={"ID":"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d","Type":"ContainerStarted","Data":"02d7c20cc4742faaef48cebf760eaa98268af1d05f7a95cbf8ccf82a2b7b6b26"} Oct 07 17:42:22 crc kubenswrapper[4681]: I1007 17:42:22.254441 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc" event={"ID":"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d","Type":"ContainerStarted","Data":"c74cdecbf881c92d3ff18f139db52a532c1c46db0254e28a98381e312db468e6"} Oct 07 17:42:22 crc kubenswrapper[4681]: I1007 17:42:22.271088 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc" podStartSLOduration=2.08967975 podStartE2EDuration="2.27107243s" podCreationTimestamp="2025-10-07 17:42:20 +0000 UTC" firstStartedPulling="2025-10-07 17:42:21.190764729 +0000 UTC m=+2344.838176274" lastFinishedPulling="2025-10-07 17:42:21.372157389 +0000 UTC m=+2345.019568954" observedRunningTime="2025-10-07 17:42:22.268831869 +0000 UTC m=+2345.916243424" watchObservedRunningTime="2025-10-07 17:42:22.27107243 +0000 UTC m=+2345.918483985" Oct 07 17:42:28 crc kubenswrapper[4681]: I1007 17:42:28.029765 4681 scope.go:117] "RemoveContainer" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" Oct 07 17:42:28 crc kubenswrapper[4681]: E1007 17:42:28.030482 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:42:43 crc kubenswrapper[4681]: I1007 17:42:43.029390 4681 scope.go:117] "RemoveContainer" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" Oct 07 17:42:43 crc kubenswrapper[4681]: E1007 17:42:43.030135 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:42:55 crc kubenswrapper[4681]: I1007 17:42:55.029675 4681 scope.go:117] "RemoveContainer" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" Oct 07 17:42:55 crc kubenswrapper[4681]: E1007 17:42:55.030496 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:43:10 crc kubenswrapper[4681]: I1007 17:43:10.029970 4681 scope.go:117] "RemoveContainer" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" Oct 07 17:43:10 crc kubenswrapper[4681]: E1007 17:43:10.032273 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:43:22 crc kubenswrapper[4681]: I1007 17:43:22.029029 4681 scope.go:117] "RemoveContainer" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" Oct 07 17:43:22 crc kubenswrapper[4681]: E1007 17:43:22.030262 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:43:35 crc kubenswrapper[4681]: I1007 17:43:35.029096 4681 scope.go:117] "RemoveContainer" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" Oct 07 17:43:35 crc kubenswrapper[4681]: E1007 17:43:35.030771 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:43:49 crc kubenswrapper[4681]: I1007 17:43:49.029264 4681 scope.go:117] "RemoveContainer" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" Oct 07 17:43:49 crc kubenswrapper[4681]: E1007 17:43:49.029968 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:44:01 crc kubenswrapper[4681]: I1007 17:44:01.029357 4681 
scope.go:117] "RemoveContainer" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" Oct 07 17:44:01 crc kubenswrapper[4681]: E1007 17:44:01.029998 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:44:12 crc kubenswrapper[4681]: I1007 17:44:12.030127 4681 scope.go:117] "RemoveContainer" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" Oct 07 17:44:12 crc kubenswrapper[4681]: E1007 17:44:12.031736 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:44:23 crc kubenswrapper[4681]: I1007 17:44:23.030082 4681 scope.go:117] "RemoveContainer" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" Oct 07 17:44:23 crc kubenswrapper[4681]: E1007 17:44:23.030914 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:44:35 crc kubenswrapper[4681]: I1007 17:44:35.032367 4681 scope.go:117] "RemoveContainer" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" Oct 07 17:44:35 crc kubenswrapper[4681]: E1007 17:44:35.033232 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:44:39 crc kubenswrapper[4681]: I1007 17:44:39.349102 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zz6xn"] Oct 07 17:44:39 crc kubenswrapper[4681]: I1007 17:44:39.351587 4681 util.go:30] "No sandbox for pod can be found. 
Oct 07 17:44:39 crc kubenswrapper[4681]: I1007 17:44:39.426354 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zz6xn"]
Oct 07 17:44:39 crc kubenswrapper[4681]: I1007 17:44:39.463325 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9749ca62-a0b9-4d29-a336-68b367fbd7c5-utilities\") pod \"redhat-operators-zz6xn\" (UID: \"9749ca62-a0b9-4d29-a336-68b367fbd7c5\") " pod="openshift-marketplace/redhat-operators-zz6xn"
Oct 07 17:44:39 crc kubenswrapper[4681]: I1007 17:44:39.463701 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9749ca62-a0b9-4d29-a336-68b367fbd7c5-catalog-content\") pod \"redhat-operators-zz6xn\" (UID: \"9749ca62-a0b9-4d29-a336-68b367fbd7c5\") " pod="openshift-marketplace/redhat-operators-zz6xn"
Oct 07 17:44:39 crc kubenswrapper[4681]: I1007 17:44:39.463745 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qb67\" (UniqueName: \"kubernetes.io/projected/9749ca62-a0b9-4d29-a336-68b367fbd7c5-kube-api-access-2qb67\") pod \"redhat-operators-zz6xn\" (UID: \"9749ca62-a0b9-4d29-a336-68b367fbd7c5\") " pod="openshift-marketplace/redhat-operators-zz6xn"
Oct 07 17:44:39 crc kubenswrapper[4681]: I1007 17:44:39.565431 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9749ca62-a0b9-4d29-a336-68b367fbd7c5-utilities\") pod \"redhat-operators-zz6xn\" (UID: \"9749ca62-a0b9-4d29-a336-68b367fbd7c5\") " pod="openshift-marketplace/redhat-operators-zz6xn"
Oct 07 17:44:39 crc kubenswrapper[4681]: I1007 17:44:39.565590 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9749ca62-a0b9-4d29-a336-68b367fbd7c5-catalog-content\") pod \"redhat-operators-zz6xn\" (UID: \"9749ca62-a0b9-4d29-a336-68b367fbd7c5\") " pod="openshift-marketplace/redhat-operators-zz6xn"
Oct 07 17:44:39 crc kubenswrapper[4681]: I1007 17:44:39.565613 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qb67\" (UniqueName: \"kubernetes.io/projected/9749ca62-a0b9-4d29-a336-68b367fbd7c5-kube-api-access-2qb67\") pod \"redhat-operators-zz6xn\" (UID: \"9749ca62-a0b9-4d29-a336-68b367fbd7c5\") " pod="openshift-marketplace/redhat-operators-zz6xn"
Oct 07 17:44:39 crc kubenswrapper[4681]: I1007 17:44:39.565902 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9749ca62-a0b9-4d29-a336-68b367fbd7c5-utilities\") pod \"redhat-operators-zz6xn\" (UID: \"9749ca62-a0b9-4d29-a336-68b367fbd7c5\") " pod="openshift-marketplace/redhat-operators-zz6xn"
Oct 07 17:44:39 crc kubenswrapper[4681]: I1007 17:44:39.566034 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9749ca62-a0b9-4d29-a336-68b367fbd7c5-catalog-content\") pod \"redhat-operators-zz6xn\" (UID: \"9749ca62-a0b9-4d29-a336-68b367fbd7c5\") " pod="openshift-marketplace/redhat-operators-zz6xn"
Oct 07 17:44:39 crc kubenswrapper[4681]: I1007 17:44:39.591150 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qb67\" (UniqueName: \"kubernetes.io/projected/9749ca62-a0b9-4d29-a336-68b367fbd7c5-kube-api-access-2qb67\") pod \"redhat-operators-zz6xn\" (UID: \"9749ca62-a0b9-4d29-a336-68b367fbd7c5\") " pod="openshift-marketplace/redhat-operators-zz6xn"
\"kube-api-access-2qb67\" (UniqueName: \"kubernetes.io/projected/9749ca62-a0b9-4d29-a336-68b367fbd7c5-kube-api-access-2qb67\") pod \"redhat-operators-zz6xn\" (UID: \"9749ca62-a0b9-4d29-a336-68b367fbd7c5\") " pod="openshift-marketplace/redhat-operators-zz6xn" Oct 07 17:44:39 crc kubenswrapper[4681]: I1007 17:44:39.677742 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zz6xn" Oct 07 17:44:39 crc kubenswrapper[4681]: I1007 17:44:39.943741 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zz6xn"] Oct 07 17:44:40 crc kubenswrapper[4681]: I1007 17:44:40.556260 4681 generic.go:334] "Generic (PLEG): container finished" podID="9749ca62-a0b9-4d29-a336-68b367fbd7c5" containerID="cfb7e5aecad1d33eec9d92333e03104ffd39d292c9a32e1c8699ee75fafa4e4b" exitCode=0 Oct 07 17:44:40 crc kubenswrapper[4681]: I1007 17:44:40.556347 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zz6xn" event={"ID":"9749ca62-a0b9-4d29-a336-68b367fbd7c5","Type":"ContainerDied","Data":"cfb7e5aecad1d33eec9d92333e03104ffd39d292c9a32e1c8699ee75fafa4e4b"} Oct 07 17:44:40 crc kubenswrapper[4681]: I1007 17:44:40.556782 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zz6xn" event={"ID":"9749ca62-a0b9-4d29-a336-68b367fbd7c5","Type":"ContainerStarted","Data":"b2ad27d21a51d15e88575714840f3ac33602b924c294123774064c50bd7a1277"} Oct 07 17:44:40 crc kubenswrapper[4681]: I1007 17:44:40.558419 4681 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 17:44:42 crc kubenswrapper[4681]: I1007 17:44:42.576394 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zz6xn" event={"ID":"9749ca62-a0b9-4d29-a336-68b367fbd7c5","Type":"ContainerStarted","Data":"f412c851de28202d6def47b0cc95ea2e1449651605b762abaed0105d90088730"} Oct 07 17:44:45 crc kubenswrapper[4681]: I1007 17:44:45.603508 4681 generic.go:334] "Generic (PLEG): container finished" podID="9749ca62-a0b9-4d29-a336-68b367fbd7c5" containerID="f412c851de28202d6def47b0cc95ea2e1449651605b762abaed0105d90088730" exitCode=0 Oct 07 17:44:45 crc kubenswrapper[4681]: I1007 17:44:45.603588 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zz6xn" event={"ID":"9749ca62-a0b9-4d29-a336-68b367fbd7c5","Type":"ContainerDied","Data":"f412c851de28202d6def47b0cc95ea2e1449651605b762abaed0105d90088730"} Oct 07 17:44:46 crc kubenswrapper[4681]: I1007 17:44:46.623217 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zz6xn" event={"ID":"9749ca62-a0b9-4d29-a336-68b367fbd7c5","Type":"ContainerStarted","Data":"18d9f120bfc68ba3e49654c0f1ff9e06fa21972dae03b590d9fa94e6de493d2e"} Oct 07 17:44:46 crc kubenswrapper[4681]: I1007 17:44:46.641567 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zz6xn" podStartSLOduration=2.179692842 podStartE2EDuration="7.641551089s" podCreationTimestamp="2025-10-07 17:44:39 +0000 UTC" firstStartedPulling="2025-10-07 17:44:40.558233288 +0000 UTC m=+2484.205644843" lastFinishedPulling="2025-10-07 17:44:46.020091525 +0000 UTC m=+2489.667503090" observedRunningTime="2025-10-07 17:44:46.63900288 +0000 UTC m=+2490.286414445" watchObservedRunningTime="2025-10-07 17:44:46.641551089 +0000 UTC m=+2490.288962644" Oct 07 17:44:47 crc 
Oct 07 17:44:47 crc kubenswrapper[4681]: I1007 17:44:47.036514 4681 scope.go:117] "RemoveContainer" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11"
Oct 07 17:44:47 crc kubenswrapper[4681]: E1007 17:44:47.036878 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea"
Oct 07 17:44:49 crc kubenswrapper[4681]: I1007 17:44:49.678034 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zz6xn"
Oct 07 17:44:49 crc kubenswrapper[4681]: I1007 17:44:49.678951 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zz6xn"
Oct 07 17:44:50 crc kubenswrapper[4681]: I1007 17:44:50.734797 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zz6xn" podUID="9749ca62-a0b9-4d29-a336-68b367fbd7c5" containerName="registry-server" probeResult="failure" output=<
Oct 07 17:44:50 crc kubenswrapper[4681]: 	timeout: failed to connect service ":50051" within 1s
Oct 07 17:44:50 crc kubenswrapper[4681]: >
Oct 07 17:44:59 crc kubenswrapper[4681]: I1007 17:44:59.029254 4681 scope.go:117] "RemoveContainer" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11"
Oct 07 17:44:59 crc kubenswrapper[4681]: E1007 17:44:59.030955 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea"
Oct 07 17:44:59 crc kubenswrapper[4681]: I1007 17:44:59.727842 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zz6xn"
Oct 07 17:44:59 crc kubenswrapper[4681]: I1007 17:44:59.784581 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zz6xn"
Oct 07 17:44:59 crc kubenswrapper[4681]: I1007 17:44:59.978322 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zz6xn"]
Oct 07 17:45:00 crc kubenswrapper[4681]: I1007 17:45:00.150218 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330985-mj5jz"]
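The startup-probe failure above ("timeout: failed to connect service \":50051\" within 1s") is a gRPC probe against the registry-server's health endpoint; the pod only transitions to started and then ready once that check answers in time, which is exactly what the probe="startup" status="started" and probe="readiness" status="ready" entries record nine seconds later. A probe of the same shape using grpc-go's standard health client; the port and 1s timeout are copied from the log, the rest is a generic sketch rather than the kubelet's prober:

package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	healthpb "google.golang.org/grpc/health/grpc_health_v1"
)

// probe dials addr and runs one gRPC health check within timeout,
// mirroring a startup probe with timeoutSeconds: 1.
func probe(addr string, timeout time.Duration) error {
	ctx, cancel := context.WithTimeout(context.Background(), timeout)
	defer cancel()

	conn, err := grpc.DialContext(ctx, addr,
		grpc.WithTransportCredentials(insecure.NewCredentials()),
		grpc.WithBlock()) // fail fast if the socket never opens
	if err != nil {
		return fmt.Errorf("failed to connect service %q within %v: %w", addr, timeout, err)
	}
	defer conn.Close()

	_, err = healthpb.NewHealthClient(conn).Check(ctx, &healthpb.HealthCheckRequest{})
	return err
}

func main() {
	if err := probe("localhost:50051", time.Second); err != nil {
		fmt.Println("probe failed:", err)
	}
}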
Oct 07 17:45:00 crc kubenswrapper[4681]: I1007 17:45:00.151504 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330985-mj5jz"
Oct 07 17:45:00 crc kubenswrapper[4681]: I1007 17:45:00.161310 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 07 17:45:00 crc kubenswrapper[4681]: I1007 17:45:00.161403 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 07 17:45:00 crc kubenswrapper[4681]: I1007 17:45:00.168434 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330985-mj5jz"]
Oct 07 17:45:00 crc kubenswrapper[4681]: I1007 17:45:00.270357 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brf5x\" (UniqueName: \"kubernetes.io/projected/e1f5432a-f229-4146-93f1-d053b31f680a-kube-api-access-brf5x\") pod \"collect-profiles-29330985-mj5jz\" (UID: \"e1f5432a-f229-4146-93f1-d053b31f680a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330985-mj5jz"
Oct 07 17:45:00 crc kubenswrapper[4681]: I1007 17:45:00.270434 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e1f5432a-f229-4146-93f1-d053b31f680a-secret-volume\") pod \"collect-profiles-29330985-mj5jz\" (UID: \"e1f5432a-f229-4146-93f1-d053b31f680a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330985-mj5jz"
Oct 07 17:45:00 crc kubenswrapper[4681]: I1007 17:45:00.270634 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1f5432a-f229-4146-93f1-d053b31f680a-config-volume\") pod \"collect-profiles-29330985-mj5jz\" (UID: \"e1f5432a-f229-4146-93f1-d053b31f680a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330985-mj5jz"
Oct 07 17:45:00 crc kubenswrapper[4681]: I1007 17:45:00.372609 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1f5432a-f229-4146-93f1-d053b31f680a-config-volume\") pod \"collect-profiles-29330985-mj5jz\" (UID: \"e1f5432a-f229-4146-93f1-d053b31f680a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330985-mj5jz"
Oct 07 17:45:00 crc kubenswrapper[4681]: I1007 17:45:00.372979 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brf5x\" (UniqueName: \"kubernetes.io/projected/e1f5432a-f229-4146-93f1-d053b31f680a-kube-api-access-brf5x\") pod \"collect-profiles-29330985-mj5jz\" (UID: \"e1f5432a-f229-4146-93f1-d053b31f680a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330985-mj5jz"
Oct 07 17:45:00 crc kubenswrapper[4681]: I1007 17:45:00.373084 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e1f5432a-f229-4146-93f1-d053b31f680a-secret-volume\") pod \"collect-profiles-29330985-mj5jz\" (UID: \"e1f5432a-f229-4146-93f1-d053b31f680a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330985-mj5jz"
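The job name collect-profiles-29330985-mj5jz encodes its schedule: the CronJob controller suffixes each Job with the scheduled time expressed in minutes since the Unix epoch, and 29330985 minutes works out to exactly 2025-10-07 17:45:00 UTC, the timestamp on these entries (the collect-profiles CronJob runs every 15 minutes). The conversion:

package main

import (
	"fmt"
	"time"
)

func main() {
	const suffix = 29330985 // from collect-profiles-29330985-mj5jz
	// The suffix is the scheduled time in minutes since the epoch.
	t := time.Unix(suffix*60, 0).UTC()
	fmt.Println(t) // 2025-10-07 17:45:00 +0000 UTC
}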
\"collect-profiles-29330985-mj5jz\" (UID: \"e1f5432a-f229-4146-93f1-d053b31f680a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330985-mj5jz" Oct 07 17:45:00 crc kubenswrapper[4681]: I1007 17:45:00.379001 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e1f5432a-f229-4146-93f1-d053b31f680a-secret-volume\") pod \"collect-profiles-29330985-mj5jz\" (UID: \"e1f5432a-f229-4146-93f1-d053b31f680a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330985-mj5jz" Oct 07 17:45:00 crc kubenswrapper[4681]: I1007 17:45:00.390716 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brf5x\" (UniqueName: \"kubernetes.io/projected/e1f5432a-f229-4146-93f1-d053b31f680a-kube-api-access-brf5x\") pod \"collect-profiles-29330985-mj5jz\" (UID: \"e1f5432a-f229-4146-93f1-d053b31f680a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330985-mj5jz" Oct 07 17:45:00 crc kubenswrapper[4681]: I1007 17:45:00.533576 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330985-mj5jz" Oct 07 17:45:00 crc kubenswrapper[4681]: I1007 17:45:00.964037 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330985-mj5jz"] Oct 07 17:45:00 crc kubenswrapper[4681]: W1007 17:45:00.969682 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1f5432a_f229_4146_93f1_d053b31f680a.slice/crio-8f14377280115c6658f006e7e3cacfdec78ad0b6a4a3ac67524feab7e1a123df WatchSource:0}: Error finding container 8f14377280115c6658f006e7e3cacfdec78ad0b6a4a3ac67524feab7e1a123df: Status 404 returned error can't find the container with id 8f14377280115c6658f006e7e3cacfdec78ad0b6a4a3ac67524feab7e1a123df Oct 07 17:45:01 crc kubenswrapper[4681]: I1007 17:45:01.754493 4681 generic.go:334] "Generic (PLEG): container finished" podID="e1f5432a-f229-4146-93f1-d053b31f680a" containerID="06573a68eacfcfce60cff64ce45d16aac7bd144443fceb9d68cd78b91fda4c4b" exitCode=0 Oct 07 17:45:01 crc kubenswrapper[4681]: I1007 17:45:01.754979 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zz6xn" podUID="9749ca62-a0b9-4d29-a336-68b367fbd7c5" containerName="registry-server" containerID="cri-o://18d9f120bfc68ba3e49654c0f1ff9e06fa21972dae03b590d9fa94e6de493d2e" gracePeriod=2 Oct 07 17:45:01 crc kubenswrapper[4681]: I1007 17:45:01.754701 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330985-mj5jz" event={"ID":"e1f5432a-f229-4146-93f1-d053b31f680a","Type":"ContainerDied","Data":"06573a68eacfcfce60cff64ce45d16aac7bd144443fceb9d68cd78b91fda4c4b"} Oct 07 17:45:01 crc kubenswrapper[4681]: I1007 17:45:01.755040 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330985-mj5jz" event={"ID":"e1f5432a-f229-4146-93f1-d053b31f680a","Type":"ContainerStarted","Data":"8f14377280115c6658f006e7e3cacfdec78ad0b6a4a3ac67524feab7e1a123df"} Oct 07 17:45:02 crc kubenswrapper[4681]: I1007 17:45:02.329315 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zz6xn" Oct 07 17:45:02 crc kubenswrapper[4681]: I1007 17:45:02.409817 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9749ca62-a0b9-4d29-a336-68b367fbd7c5-utilities\") pod \"9749ca62-a0b9-4d29-a336-68b367fbd7c5\" (UID: \"9749ca62-a0b9-4d29-a336-68b367fbd7c5\") " Oct 07 17:45:02 crc kubenswrapper[4681]: I1007 17:45:02.410373 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qb67\" (UniqueName: \"kubernetes.io/projected/9749ca62-a0b9-4d29-a336-68b367fbd7c5-kube-api-access-2qb67\") pod \"9749ca62-a0b9-4d29-a336-68b367fbd7c5\" (UID: \"9749ca62-a0b9-4d29-a336-68b367fbd7c5\") " Oct 07 17:45:02 crc kubenswrapper[4681]: I1007 17:45:02.410424 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9749ca62-a0b9-4d29-a336-68b367fbd7c5-catalog-content\") pod \"9749ca62-a0b9-4d29-a336-68b367fbd7c5\" (UID: \"9749ca62-a0b9-4d29-a336-68b367fbd7c5\") " Oct 07 17:45:02 crc kubenswrapper[4681]: I1007 17:45:02.410706 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9749ca62-a0b9-4d29-a336-68b367fbd7c5-utilities" (OuterVolumeSpecName: "utilities") pod "9749ca62-a0b9-4d29-a336-68b367fbd7c5" (UID: "9749ca62-a0b9-4d29-a336-68b367fbd7c5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:45:02 crc kubenswrapper[4681]: I1007 17:45:02.411375 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9749ca62-a0b9-4d29-a336-68b367fbd7c5-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 17:45:02 crc kubenswrapper[4681]: I1007 17:45:02.415961 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9749ca62-a0b9-4d29-a336-68b367fbd7c5-kube-api-access-2qb67" (OuterVolumeSpecName: "kube-api-access-2qb67") pod "9749ca62-a0b9-4d29-a336-68b367fbd7c5" (UID: "9749ca62-a0b9-4d29-a336-68b367fbd7c5"). InnerVolumeSpecName "kube-api-access-2qb67". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:45:02 crc kubenswrapper[4681]: I1007 17:45:02.495193 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9749ca62-a0b9-4d29-a336-68b367fbd7c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9749ca62-a0b9-4d29-a336-68b367fbd7c5" (UID: "9749ca62-a0b9-4d29-a336-68b367fbd7c5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:45:02 crc kubenswrapper[4681]: I1007 17:45:02.513443 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qb67\" (UniqueName: \"kubernetes.io/projected/9749ca62-a0b9-4d29-a336-68b367fbd7c5-kube-api-access-2qb67\") on node \"crc\" DevicePath \"\"" Oct 07 17:45:02 crc kubenswrapper[4681]: I1007 17:45:02.513474 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9749ca62-a0b9-4d29-a336-68b367fbd7c5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 17:45:02 crc kubenswrapper[4681]: I1007 17:45:02.766923 4681 generic.go:334] "Generic (PLEG): container finished" podID="9749ca62-a0b9-4d29-a336-68b367fbd7c5" containerID="18d9f120bfc68ba3e49654c0f1ff9e06fa21972dae03b590d9fa94e6de493d2e" exitCode=0 Oct 07 17:45:02 crc kubenswrapper[4681]: I1007 17:45:02.767012 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zz6xn" Oct 07 17:45:02 crc kubenswrapper[4681]: I1007 17:45:02.767054 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zz6xn" event={"ID":"9749ca62-a0b9-4d29-a336-68b367fbd7c5","Type":"ContainerDied","Data":"18d9f120bfc68ba3e49654c0f1ff9e06fa21972dae03b590d9fa94e6de493d2e"} Oct 07 17:45:02 crc kubenswrapper[4681]: I1007 17:45:02.767088 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zz6xn" event={"ID":"9749ca62-a0b9-4d29-a336-68b367fbd7c5","Type":"ContainerDied","Data":"b2ad27d21a51d15e88575714840f3ac33602b924c294123774064c50bd7a1277"} Oct 07 17:45:02 crc kubenswrapper[4681]: I1007 17:45:02.767105 4681 scope.go:117] "RemoveContainer" containerID="18d9f120bfc68ba3e49654c0f1ff9e06fa21972dae03b590d9fa94e6de493d2e" Oct 07 17:45:02 crc kubenswrapper[4681]: I1007 17:45:02.804160 4681 scope.go:117] "RemoveContainer" containerID="f412c851de28202d6def47b0cc95ea2e1449651605b762abaed0105d90088730" Oct 07 17:45:02 crc kubenswrapper[4681]: I1007 17:45:02.818252 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zz6xn"] Oct 07 17:45:02 crc kubenswrapper[4681]: I1007 17:45:02.828714 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zz6xn"] Oct 07 17:45:02 crc kubenswrapper[4681]: I1007 17:45:02.836118 4681 scope.go:117] "RemoveContainer" containerID="cfb7e5aecad1d33eec9d92333e03104ffd39d292c9a32e1c8699ee75fafa4e4b" Oct 07 17:45:02 crc kubenswrapper[4681]: I1007 17:45:02.869533 4681 scope.go:117] "RemoveContainer" containerID="18d9f120bfc68ba3e49654c0f1ff9e06fa21972dae03b590d9fa94e6de493d2e" Oct 07 17:45:02 crc kubenswrapper[4681]: E1007 17:45:02.870840 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18d9f120bfc68ba3e49654c0f1ff9e06fa21972dae03b590d9fa94e6de493d2e\": container with ID starting with 18d9f120bfc68ba3e49654c0f1ff9e06fa21972dae03b590d9fa94e6de493d2e not found: ID does not exist" containerID="18d9f120bfc68ba3e49654c0f1ff9e06fa21972dae03b590d9fa94e6de493d2e" Oct 07 17:45:02 crc kubenswrapper[4681]: I1007 17:45:02.870893 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18d9f120bfc68ba3e49654c0f1ff9e06fa21972dae03b590d9fa94e6de493d2e"} err="failed to get container status \"18d9f120bfc68ba3e49654c0f1ff9e06fa21972dae03b590d9fa94e6de493d2e\": 
rpc error: code = NotFound desc = could not find container \"18d9f120bfc68ba3e49654c0f1ff9e06fa21972dae03b590d9fa94e6de493d2e\": container with ID starting with 18d9f120bfc68ba3e49654c0f1ff9e06fa21972dae03b590d9fa94e6de493d2e not found: ID does not exist" Oct 07 17:45:02 crc kubenswrapper[4681]: I1007 17:45:02.870916 4681 scope.go:117] "RemoveContainer" containerID="f412c851de28202d6def47b0cc95ea2e1449651605b762abaed0105d90088730" Oct 07 17:45:02 crc kubenswrapper[4681]: E1007 17:45:02.871335 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f412c851de28202d6def47b0cc95ea2e1449651605b762abaed0105d90088730\": container with ID starting with f412c851de28202d6def47b0cc95ea2e1449651605b762abaed0105d90088730 not found: ID does not exist" containerID="f412c851de28202d6def47b0cc95ea2e1449651605b762abaed0105d90088730" Oct 07 17:45:02 crc kubenswrapper[4681]: I1007 17:45:02.871384 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f412c851de28202d6def47b0cc95ea2e1449651605b762abaed0105d90088730"} err="failed to get container status \"f412c851de28202d6def47b0cc95ea2e1449651605b762abaed0105d90088730\": rpc error: code = NotFound desc = could not find container \"f412c851de28202d6def47b0cc95ea2e1449651605b762abaed0105d90088730\": container with ID starting with f412c851de28202d6def47b0cc95ea2e1449651605b762abaed0105d90088730 not found: ID does not exist" Oct 07 17:45:02 crc kubenswrapper[4681]: I1007 17:45:02.871417 4681 scope.go:117] "RemoveContainer" containerID="cfb7e5aecad1d33eec9d92333e03104ffd39d292c9a32e1c8699ee75fafa4e4b" Oct 07 17:45:02 crc kubenswrapper[4681]: E1007 17:45:02.871800 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfb7e5aecad1d33eec9d92333e03104ffd39d292c9a32e1c8699ee75fafa4e4b\": container with ID starting with cfb7e5aecad1d33eec9d92333e03104ffd39d292c9a32e1c8699ee75fafa4e4b not found: ID does not exist" containerID="cfb7e5aecad1d33eec9d92333e03104ffd39d292c9a32e1c8699ee75fafa4e4b" Oct 07 17:45:02 crc kubenswrapper[4681]: I1007 17:45:02.871825 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfb7e5aecad1d33eec9d92333e03104ffd39d292c9a32e1c8699ee75fafa4e4b"} err="failed to get container status \"cfb7e5aecad1d33eec9d92333e03104ffd39d292c9a32e1c8699ee75fafa4e4b\": rpc error: code = NotFound desc = could not find container \"cfb7e5aecad1d33eec9d92333e03104ffd39d292c9a32e1c8699ee75fafa4e4b\": container with ID starting with cfb7e5aecad1d33eec9d92333e03104ffd39d292c9a32e1c8699ee75fafa4e4b not found: ID does not exist" Oct 07 17:45:03 crc kubenswrapper[4681]: I1007 17:45:03.040429 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9749ca62-a0b9-4d29-a336-68b367fbd7c5" path="/var/lib/kubelet/pods/9749ca62-a0b9-4d29-a336-68b367fbd7c5/volumes"
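All three "ContainerStatus from runtime service failed ... NotFound" errors above are cleanup re-runs for containers CRI-O has already deleted; the kubelet logs them and moves on, so removal is effectively idempotent. A sketch of that tolerance using gRPC status codes (removeIfPresent is a hypothetical helper, not a kubelet function):

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // removeIfPresent treats a NotFound from the CRI runtime as
    // "already gone", making container cleanup safe to repeat.
    func removeIfPresent(remove func(id string) error, id string) error {
        if err := remove(id); err != nil && status.Code(err) != codes.NotFound {
            return err
        }
        return nil
    }

    func main() {
        alreadyGone := func(id string) error {
            return status.Errorf(codes.NotFound, "could not find container %q", id)
        }
        err := removeIfPresent(alreadyGone, "18d9f120bfc6")
        fmt.Println("cleanup error:", err) // cleanup error: <nil>
    }

Oct 07 17:45:03 crc kubenswrapper[4681]: I1007 17:45:03.083492 4681 util.go:48] "No ready sandbox for pod can be found.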
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330985-mj5jz" Oct 07 17:45:03 crc kubenswrapper[4681]: I1007 17:45:03.125343 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brf5x\" (UniqueName: \"kubernetes.io/projected/e1f5432a-f229-4146-93f1-d053b31f680a-kube-api-access-brf5x\") pod \"e1f5432a-f229-4146-93f1-d053b31f680a\" (UID: \"e1f5432a-f229-4146-93f1-d053b31f680a\") " Oct 07 17:45:03 crc kubenswrapper[4681]: I1007 17:45:03.125404 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1f5432a-f229-4146-93f1-d053b31f680a-config-volume\") pod \"e1f5432a-f229-4146-93f1-d053b31f680a\" (UID: \"e1f5432a-f229-4146-93f1-d053b31f680a\") " Oct 07 17:45:03 crc kubenswrapper[4681]: I1007 17:45:03.125438 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e1f5432a-f229-4146-93f1-d053b31f680a-secret-volume\") pod \"e1f5432a-f229-4146-93f1-d053b31f680a\" (UID: \"e1f5432a-f229-4146-93f1-d053b31f680a\") " Oct 07 17:45:03 crc kubenswrapper[4681]: I1007 17:45:03.127242 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1f5432a-f229-4146-93f1-d053b31f680a-config-volume" (OuterVolumeSpecName: "config-volume") pod "e1f5432a-f229-4146-93f1-d053b31f680a" (UID: "e1f5432a-f229-4146-93f1-d053b31f680a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:45:03 crc kubenswrapper[4681]: I1007 17:45:03.130650 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1f5432a-f229-4146-93f1-d053b31f680a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e1f5432a-f229-4146-93f1-d053b31f680a" (UID: "e1f5432a-f229-4146-93f1-d053b31f680a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:45:03 crc kubenswrapper[4681]: I1007 17:45:03.136049 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1f5432a-f229-4146-93f1-d053b31f680a-kube-api-access-brf5x" (OuterVolumeSpecName: "kube-api-access-brf5x") pod "e1f5432a-f229-4146-93f1-d053b31f680a" (UID: "e1f5432a-f229-4146-93f1-d053b31f680a"). InnerVolumeSpecName "kube-api-access-brf5x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:45:03 crc kubenswrapper[4681]: I1007 17:45:03.227418 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brf5x\" (UniqueName: \"kubernetes.io/projected/e1f5432a-f229-4146-93f1-d053b31f680a-kube-api-access-brf5x\") on node \"crc\" DevicePath \"\"" Oct 07 17:45:03 crc kubenswrapper[4681]: I1007 17:45:03.227451 4681 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1f5432a-f229-4146-93f1-d053b31f680a-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 17:45:03 crc kubenswrapper[4681]: I1007 17:45:03.227460 4681 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e1f5432a-f229-4146-93f1-d053b31f680a-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 17:45:03 crc kubenswrapper[4681]: I1007 17:45:03.778629 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330985-mj5jz" event={"ID":"e1f5432a-f229-4146-93f1-d053b31f680a","Type":"ContainerDied","Data":"8f14377280115c6658f006e7e3cacfdec78ad0b6a4a3ac67524feab7e1a123df"} Oct 07 17:45:03 crc kubenswrapper[4681]: I1007 17:45:03.778954 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f14377280115c6658f006e7e3cacfdec78ad0b6a4a3ac67524feab7e1a123df" Oct 07 17:45:03 crc kubenswrapper[4681]: I1007 17:45:03.778688 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330985-mj5jz" Oct 07 17:45:04 crc kubenswrapper[4681]: I1007 17:45:04.157290 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330940-w7fp5"] Oct 07 17:45:04 crc kubenswrapper[4681]: I1007 17:45:04.165049 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330940-w7fp5"] Oct 07 17:45:05 crc kubenswrapper[4681]: I1007 17:45:05.043087 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abc28b46-2a9f-4141-8e65-a9c956e0f261" path="/var/lib/kubelet/pods/abc28b46-2a9f-4141-8e65-a9c956e0f261/volumes" Oct 07 17:45:11 crc kubenswrapper[4681]: I1007 17:45:11.029111 4681 scope.go:117] "RemoveContainer" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" Oct 07 17:45:11 crc kubenswrapper[4681]: E1007 17:45:11.029840 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:45:24 crc kubenswrapper[4681]: I1007 17:45:24.030046 4681 scope.go:117] "RemoveContainer" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" Oct 07 17:45:24 crc kubenswrapper[4681]: I1007 17:45:24.971932 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerStarted","Data":"4a44aa30a0b57e231fa4a946a2c711605ee1432e6ff6cac75f3ee512920db919"} Oct 07 17:45:36 crc kubenswrapper[4681]: I1007 17:45:36.322648 4681 
scope.go:117] "RemoveContainer" containerID="86c8f25c68b4cd646ad8682cad9f74bc3d777bdb6ed7b62f93e0ab1890d5d373" Oct 07 17:47:42 crc kubenswrapper[4681]: I1007 17:47:42.195166 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 17:47:42 crc kubenswrapper[4681]: I1007 17:47:42.195739 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 17:48:06 crc kubenswrapper[4681]: I1007 17:48:06.350993 4681 generic.go:334] "Generic (PLEG): container finished" podID="3c08afe2-1291-4ac9-8eb5-493f9cff1c4d" containerID="c74cdecbf881c92d3ff18f139db52a532c1c46db0254e28a98381e312db468e6" exitCode=0 Oct 07 17:48:06 crc kubenswrapper[4681]: I1007 17:48:06.351101 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc" event={"ID":"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d","Type":"ContainerDied","Data":"c74cdecbf881c92d3ff18f139db52a532c1c46db0254e28a98381e312db468e6"} Oct 07 17:48:07 crc kubenswrapper[4681]: I1007 17:48:07.796472 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc" Oct 07 17:48:07 crc kubenswrapper[4681]: I1007 17:48:07.823327 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-libvirt-secret-0\") pod \"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d\" (UID: \"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d\") " Oct 07 17:48:07 crc kubenswrapper[4681]: I1007 17:48:07.823391 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-libvirt-combined-ca-bundle\") pod \"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d\" (UID: \"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d\") " Oct 07 17:48:07 crc kubenswrapper[4681]: I1007 17:48:07.823448 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-ssh-key\") pod \"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d\" (UID: \"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d\") " Oct 07 17:48:07 crc kubenswrapper[4681]: I1007 17:48:07.823491 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lhrg\" (UniqueName: \"kubernetes.io/projected/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-kube-api-access-2lhrg\") pod \"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d\" (UID: \"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d\") " Oct 07 17:48:07 crc kubenswrapper[4681]: I1007 17:48:07.823521 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-inventory\") pod \"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d\" (UID: \"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d\") " Oct 07 17:48:07 crc kubenswrapper[4681]: I1007 17:48:07.839838 4681 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-kube-api-access-2lhrg" (OuterVolumeSpecName: "kube-api-access-2lhrg") pod "3c08afe2-1291-4ac9-8eb5-493f9cff1c4d" (UID: "3c08afe2-1291-4ac9-8eb5-493f9cff1c4d"). InnerVolumeSpecName "kube-api-access-2lhrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:48:07 crc kubenswrapper[4681]: I1007 17:48:07.840169 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "3c08afe2-1291-4ac9-8eb5-493f9cff1c4d" (UID: "3c08afe2-1291-4ac9-8eb5-493f9cff1c4d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:48:07 crc kubenswrapper[4681]: I1007 17:48:07.858092 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "3c08afe2-1291-4ac9-8eb5-493f9cff1c4d" (UID: "3c08afe2-1291-4ac9-8eb5-493f9cff1c4d"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:48:07 crc kubenswrapper[4681]: I1007 17:48:07.859148 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3c08afe2-1291-4ac9-8eb5-493f9cff1c4d" (UID: "3c08afe2-1291-4ac9-8eb5-493f9cff1c4d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:48:07 crc kubenswrapper[4681]: I1007 17:48:07.866123 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-inventory" (OuterVolumeSpecName: "inventory") pod "3c08afe2-1291-4ac9-8eb5-493f9cff1c4d" (UID: "3c08afe2-1291-4ac9-8eb5-493f9cff1c4d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:48:07 crc kubenswrapper[4681]: I1007 17:48:07.926411 4681 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 17:48:07 crc kubenswrapper[4681]: I1007 17:48:07.926644 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lhrg\" (UniqueName: \"kubernetes.io/projected/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-kube-api-access-2lhrg\") on node \"crc\" DevicePath \"\"" Oct 07 17:48:07 crc kubenswrapper[4681]: I1007 17:48:07.926769 4681 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 17:48:07 crc kubenswrapper[4681]: I1007 17:48:07.926858 4681 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 07 17:48:07 crc kubenswrapper[4681]: I1007 17:48:07.926973 4681 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c08afe2-1291-4ac9-8eb5-493f9cff1c4d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.372852 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc" event={"ID":"3c08afe2-1291-4ac9-8eb5-493f9cff1c4d","Type":"ContainerDied","Data":"02d7c20cc4742faaef48cebf760eaa98268af1d05f7a95cbf8ccf82a2b7b6b26"} Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.372973 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02d7c20cc4742faaef48cebf760eaa98268af1d05f7a95cbf8ccf82a2b7b6b26" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.372975 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-djlbc" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.493326 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9"] Oct 07 17:48:08 crc kubenswrapper[4681]: E1007 17:48:08.493722 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1f5432a-f229-4146-93f1-d053b31f680a" containerName="collect-profiles" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.493738 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1f5432a-f229-4146-93f1-d053b31f680a" containerName="collect-profiles" Oct 07 17:48:08 crc kubenswrapper[4681]: E1007 17:48:08.493750 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9749ca62-a0b9-4d29-a336-68b367fbd7c5" containerName="extract-utilities" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.493757 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="9749ca62-a0b9-4d29-a336-68b367fbd7c5" containerName="extract-utilities" Oct 07 17:48:08 crc kubenswrapper[4681]: E1007 17:48:08.493784 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9749ca62-a0b9-4d29-a336-68b367fbd7c5" containerName="extract-content" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.493790 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="9749ca62-a0b9-4d29-a336-68b367fbd7c5" containerName="extract-content" Oct 07 17:48:08 crc kubenswrapper[4681]: E1007 17:48:08.493806 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c08afe2-1291-4ac9-8eb5-493f9cff1c4d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.493812 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c08afe2-1291-4ac9-8eb5-493f9cff1c4d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 07 17:48:08 crc kubenswrapper[4681]: E1007 17:48:08.493826 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9749ca62-a0b9-4d29-a336-68b367fbd7c5" containerName="registry-server" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.493832 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="9749ca62-a0b9-4d29-a336-68b367fbd7c5" containerName="registry-server" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.494055 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1f5432a-f229-4146-93f1-d053b31f680a" containerName="collect-profiles" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.494070 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="9749ca62-a0b9-4d29-a336-68b367fbd7c5" containerName="registry-server" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.494082 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c08afe2-1291-4ac9-8eb5-493f9cff1c4d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.494729 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.506460 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vtl6" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.506580 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.506696 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.506715 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.507007 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.507587 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9"] Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.510770 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.511021 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.538310 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfpkx\" (UniqueName: \"kubernetes.io/projected/a7d237e9-d752-4244-8f32-be01a5ca3f6f-kube-api-access-mfpkx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.538388 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.538433 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.538452 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.538481 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.538519 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.538541 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.538577 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.538591 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.641012 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfpkx\" (UniqueName: \"kubernetes.io/projected/a7d237e9-d752-4244-8f32-be01a5ca3f6f-kube-api-access-mfpkx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.641075 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.641101 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.641120 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.641152 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.641194 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.641222 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.641260 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.641277 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.643139 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.646266 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.646716 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.647151 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.647201 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.647541 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.648359 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.649738 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.667012 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfpkx\" (UniqueName: \"kubernetes.io/projected/a7d237e9-d752-4244-8f32-be01a5ca3f6f-kube-api-access-mfpkx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rk6x9\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:08 crc kubenswrapper[4681]: I1007 17:48:08.827939 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:48:09 crc kubenswrapper[4681]: I1007 17:48:09.371169 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9"] Oct 07 17:48:10 crc kubenswrapper[4681]: I1007 17:48:10.391663 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" event={"ID":"a7d237e9-d752-4244-8f32-be01a5ca3f6f","Type":"ContainerStarted","Data":"bc6517b0cb29aad4fbfd92dde2e3be4bec0fa4c71c581211128620976c7ce2d5"} Oct 07 17:48:10 crc kubenswrapper[4681]: I1007 17:48:10.392027 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" event={"ID":"a7d237e9-d752-4244-8f32-be01a5ca3f6f","Type":"ContainerStarted","Data":"71f4858d12f90ad9f0f1a4ba5973a33d9051633cd8c42a53e540516b1ae33414"} Oct 07 17:48:10 crc kubenswrapper[4681]: I1007 17:48:10.414720 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" podStartSLOduration=2.222412912 podStartE2EDuration="2.414692778s" podCreationTimestamp="2025-10-07 17:48:08 +0000 UTC" firstStartedPulling="2025-10-07 17:48:09.378412159 +0000 UTC m=+2693.025823714" lastFinishedPulling="2025-10-07 17:48:09.570692005 +0000 UTC m=+2693.218103580" observedRunningTime="2025-10-07 17:48:10.411272074 +0000 UTC m=+2694.058683629" watchObservedRunningTime="2025-10-07 17:48:10.414692778 +0000 UTC m=+2694.062104353" Oct 07 17:48:12 crc kubenswrapper[4681]: I1007 17:48:12.195806 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 17:48:12 crc kubenswrapper[4681]: I1007 17:48:12.196128 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 17:48:42 crc kubenswrapper[4681]: I1007 17:48:42.195848 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 17:48:42 crc kubenswrapper[4681]: I1007 17:48:42.196530 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 17:48:42 crc kubenswrapper[4681]: I1007 17:48:42.196591 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" Oct 07 17:48:42 crc kubenswrapper[4681]: I1007 17:48:42.197657 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4a44aa30a0b57e231fa4a946a2c711605ee1432e6ff6cac75f3ee512920db919"} 
pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 17:48:42 crc kubenswrapper[4681]: I1007 17:48:42.197755 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" containerID="cri-o://4a44aa30a0b57e231fa4a946a2c711605ee1432e6ff6cac75f3ee512920db919" gracePeriod=600 Oct 07 17:48:42 crc kubenswrapper[4681]: I1007 17:48:42.685247 4681 generic.go:334] "Generic (PLEG): container finished" podID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerID="4a44aa30a0b57e231fa4a946a2c711605ee1432e6ff6cac75f3ee512920db919" exitCode=0 Oct 07 17:48:42 crc kubenswrapper[4681]: I1007 17:48:42.685337 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerDied","Data":"4a44aa30a0b57e231fa4a946a2c711605ee1432e6ff6cac75f3ee512920db919"} Oct 07 17:48:42 crc kubenswrapper[4681]: I1007 17:48:42.685624 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerStarted","Data":"6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b"} Oct 07 17:48:42 crc kubenswrapper[4681]: I1007 17:48:42.685650 4681 scope.go:117] "RemoveContainer" containerID="0666d73f8f7dda0b39557cc16cf0a0782031e758e0209bc830bf088885925b11" Oct 07 17:49:10 crc kubenswrapper[4681]: I1007 17:49:10.783109 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m4cj7"] Oct 07 17:49:10 crc kubenswrapper[4681]: I1007 17:49:10.785992 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m4cj7" Oct 07 17:49:10 crc kubenswrapper[4681]: I1007 17:49:10.813151 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m4cj7"] Oct 07 17:49:10 crc kubenswrapper[4681]: I1007 17:49:10.906062 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kkhw\" (UniqueName: \"kubernetes.io/projected/5d3b32a5-797b-4f04-bdcb-9976eddaeb48-kube-api-access-9kkhw\") pod \"community-operators-m4cj7\" (UID: \"5d3b32a5-797b-4f04-bdcb-9976eddaeb48\") " pod="openshift-marketplace/community-operators-m4cj7" Oct 07 17:49:10 crc kubenswrapper[4681]: I1007 17:49:10.906489 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d3b32a5-797b-4f04-bdcb-9976eddaeb48-catalog-content\") pod \"community-operators-m4cj7\" (UID: \"5d3b32a5-797b-4f04-bdcb-9976eddaeb48\") " pod="openshift-marketplace/community-operators-m4cj7" Oct 07 17:49:10 crc kubenswrapper[4681]: I1007 17:49:10.906961 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d3b32a5-797b-4f04-bdcb-9976eddaeb48-utilities\") pod \"community-operators-m4cj7\" (UID: \"5d3b32a5-797b-4f04-bdcb-9976eddaeb48\") " pod="openshift-marketplace/community-operators-m4cj7" Oct 07 17:49:11 crc kubenswrapper[4681]: I1007 17:49:11.008353 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kkhw\" (UniqueName: \"kubernetes.io/projected/5d3b32a5-797b-4f04-bdcb-9976eddaeb48-kube-api-access-9kkhw\") pod \"community-operators-m4cj7\" (UID: \"5d3b32a5-797b-4f04-bdcb-9976eddaeb48\") " pod="openshift-marketplace/community-operators-m4cj7" Oct 07 17:49:11 crc kubenswrapper[4681]: I1007 17:49:11.008448 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d3b32a5-797b-4f04-bdcb-9976eddaeb48-catalog-content\") pod \"community-operators-m4cj7\" (UID: \"5d3b32a5-797b-4f04-bdcb-9976eddaeb48\") " pod="openshift-marketplace/community-operators-m4cj7" Oct 07 17:49:11 crc kubenswrapper[4681]: I1007 17:49:11.008530 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d3b32a5-797b-4f04-bdcb-9976eddaeb48-utilities\") pod \"community-operators-m4cj7\" (UID: \"5d3b32a5-797b-4f04-bdcb-9976eddaeb48\") " pod="openshift-marketplace/community-operators-m4cj7" Oct 07 17:49:11 crc kubenswrapper[4681]: I1007 17:49:11.008952 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d3b32a5-797b-4f04-bdcb-9976eddaeb48-catalog-content\") pod \"community-operators-m4cj7\" (UID: \"5d3b32a5-797b-4f04-bdcb-9976eddaeb48\") " pod="openshift-marketplace/community-operators-m4cj7" Oct 07 17:49:11 crc kubenswrapper[4681]: I1007 17:49:11.009036 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d3b32a5-797b-4f04-bdcb-9976eddaeb48-utilities\") pod \"community-operators-m4cj7\" (UID: \"5d3b32a5-797b-4f04-bdcb-9976eddaeb48\") " pod="openshift-marketplace/community-operators-m4cj7" Oct 07 17:49:11 crc kubenswrapper[4681]: I1007 17:49:11.027979 4681 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9kkhw\" (UniqueName: \"kubernetes.io/projected/5d3b32a5-797b-4f04-bdcb-9976eddaeb48-kube-api-access-9kkhw\") pod \"community-operators-m4cj7\" (UID: \"5d3b32a5-797b-4f04-bdcb-9976eddaeb48\") " pod="openshift-marketplace/community-operators-m4cj7" Oct 07 17:49:11 crc kubenswrapper[4681]: I1007 17:49:11.110570 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m4cj7" Oct 07 17:49:11 crc kubenswrapper[4681]: I1007 17:49:11.709911 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m4cj7"] Oct 07 17:49:11 crc kubenswrapper[4681]: I1007 17:49:11.940824 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m4cj7" event={"ID":"5d3b32a5-797b-4f04-bdcb-9976eddaeb48","Type":"ContainerStarted","Data":"ee6294ab52bc235a597ff818b6ced5d0ed6698eeaacd4a80eaee164db71fe598"} Oct 07 17:49:11 crc kubenswrapper[4681]: I1007 17:49:11.941182 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m4cj7" event={"ID":"5d3b32a5-797b-4f04-bdcb-9976eddaeb48","Type":"ContainerStarted","Data":"69f39f552ea4e22f89e6e66098447e3bddae5f9c4b29d3156527457d8c55dd56"} Oct 07 17:49:12 crc kubenswrapper[4681]: I1007 17:49:12.951749 4681 generic.go:334] "Generic (PLEG): container finished" podID="5d3b32a5-797b-4f04-bdcb-9976eddaeb48" containerID="ee6294ab52bc235a597ff818b6ced5d0ed6698eeaacd4a80eaee164db71fe598" exitCode=0 Oct 07 17:49:12 crc kubenswrapper[4681]: I1007 17:49:12.951840 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m4cj7" event={"ID":"5d3b32a5-797b-4f04-bdcb-9976eddaeb48","Type":"ContainerDied","Data":"ee6294ab52bc235a597ff818b6ced5d0ed6698eeaacd4a80eaee164db71fe598"} Oct 07 17:49:13 crc kubenswrapper[4681]: I1007 17:49:13.990132 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m4cj7" event={"ID":"5d3b32a5-797b-4f04-bdcb-9976eddaeb48","Type":"ContainerStarted","Data":"2c093f5812faf2276926c788cfc50432fcf9c433d85a493cac5a77a8dff62b55"} Oct 07 17:49:15 crc kubenswrapper[4681]: I1007 17:49:15.001688 4681 generic.go:334] "Generic (PLEG): container finished" podID="5d3b32a5-797b-4f04-bdcb-9976eddaeb48" containerID="2c093f5812faf2276926c788cfc50432fcf9c433d85a493cac5a77a8dff62b55" exitCode=0 Oct 07 17:49:15 crc kubenswrapper[4681]: I1007 17:49:15.001748 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m4cj7" event={"ID":"5d3b32a5-797b-4f04-bdcb-9976eddaeb48","Type":"ContainerDied","Data":"2c093f5812faf2276926c788cfc50432fcf9c433d85a493cac5a77a8dff62b55"} Oct 07 17:49:16 crc kubenswrapper[4681]: I1007 17:49:16.012335 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m4cj7" event={"ID":"5d3b32a5-797b-4f04-bdcb-9976eddaeb48","Type":"ContainerStarted","Data":"d0be9993b1bfbcb0c8f12060f7f2357499ec8e721e1630fbdfbf6a870dca87eb"} Oct 07 17:49:16 crc kubenswrapper[4681]: I1007 17:49:16.030968 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m4cj7" podStartSLOduration=2.546232926 podStartE2EDuration="6.030951293s" podCreationTimestamp="2025-10-07 17:49:10 +0000 UTC" firstStartedPulling="2025-10-07 17:49:11.947316666 +0000 UTC m=+2755.594728231" lastFinishedPulling="2025-10-07 
17:49:15.432035043 +0000 UTC m=+2759.079446598" observedRunningTime="2025-10-07 17:49:16.026574991 +0000 UTC m=+2759.673986546" watchObservedRunningTime="2025-10-07 17:49:16.030951293 +0000 UTC m=+2759.678362848" Oct 07 17:49:17 crc kubenswrapper[4681]: I1007 17:49:17.387960 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v7nsr"] Oct 07 17:49:17 crc kubenswrapper[4681]: I1007 17:49:17.390784 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v7nsr" Oct 07 17:49:17 crc kubenswrapper[4681]: I1007 17:49:17.404322 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v7nsr"] Oct 07 17:49:17 crc kubenswrapper[4681]: I1007 17:49:17.559250 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48717f89-70f6-4e1e-ba52-77ba39cd7929-catalog-content\") pod \"certified-operators-v7nsr\" (UID: \"48717f89-70f6-4e1e-ba52-77ba39cd7929\") " pod="openshift-marketplace/certified-operators-v7nsr" Oct 07 17:49:17 crc kubenswrapper[4681]: I1007 17:49:17.559589 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48717f89-70f6-4e1e-ba52-77ba39cd7929-utilities\") pod \"certified-operators-v7nsr\" (UID: \"48717f89-70f6-4e1e-ba52-77ba39cd7929\") " pod="openshift-marketplace/certified-operators-v7nsr" Oct 07 17:49:17 crc kubenswrapper[4681]: I1007 17:49:17.559712 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wvgg\" (UniqueName: \"kubernetes.io/projected/48717f89-70f6-4e1e-ba52-77ba39cd7929-kube-api-access-7wvgg\") pod \"certified-operators-v7nsr\" (UID: \"48717f89-70f6-4e1e-ba52-77ba39cd7929\") " pod="openshift-marketplace/certified-operators-v7nsr" Oct 07 17:49:17 crc kubenswrapper[4681]: I1007 17:49:17.661061 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wvgg\" (UniqueName: \"kubernetes.io/projected/48717f89-70f6-4e1e-ba52-77ba39cd7929-kube-api-access-7wvgg\") pod \"certified-operators-v7nsr\" (UID: \"48717f89-70f6-4e1e-ba52-77ba39cd7929\") " pod="openshift-marketplace/certified-operators-v7nsr" Oct 07 17:49:17 crc kubenswrapper[4681]: I1007 17:49:17.661134 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48717f89-70f6-4e1e-ba52-77ba39cd7929-catalog-content\") pod \"certified-operators-v7nsr\" (UID: \"48717f89-70f6-4e1e-ba52-77ba39cd7929\") " pod="openshift-marketplace/certified-operators-v7nsr" Oct 07 17:49:17 crc kubenswrapper[4681]: I1007 17:49:17.661173 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48717f89-70f6-4e1e-ba52-77ba39cd7929-utilities\") pod \"certified-operators-v7nsr\" (UID: \"48717f89-70f6-4e1e-ba52-77ba39cd7929\") " pod="openshift-marketplace/certified-operators-v7nsr" Oct 07 17:49:17 crc kubenswrapper[4681]: I1007 17:49:17.661628 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48717f89-70f6-4e1e-ba52-77ba39cd7929-utilities\") pod \"certified-operators-v7nsr\" (UID: \"48717f89-70f6-4e1e-ba52-77ba39cd7929\") " 
pod="openshift-marketplace/certified-operators-v7nsr" Oct 07 17:49:17 crc kubenswrapper[4681]: I1007 17:49:17.662521 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48717f89-70f6-4e1e-ba52-77ba39cd7929-catalog-content\") pod \"certified-operators-v7nsr\" (UID: \"48717f89-70f6-4e1e-ba52-77ba39cd7929\") " pod="openshift-marketplace/certified-operators-v7nsr" Oct 07 17:49:17 crc kubenswrapper[4681]: I1007 17:49:17.685566 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wvgg\" (UniqueName: \"kubernetes.io/projected/48717f89-70f6-4e1e-ba52-77ba39cd7929-kube-api-access-7wvgg\") pod \"certified-operators-v7nsr\" (UID: \"48717f89-70f6-4e1e-ba52-77ba39cd7929\") " pod="openshift-marketplace/certified-operators-v7nsr" Oct 07 17:49:17 crc kubenswrapper[4681]: I1007 17:49:17.727802 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v7nsr" Oct 07 17:49:18 crc kubenswrapper[4681]: I1007 17:49:18.382789 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v7nsr"] Oct 07 17:49:18 crc kubenswrapper[4681]: W1007 17:49:18.387753 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48717f89_70f6_4e1e_ba52_77ba39cd7929.slice/crio-f26ce995b5116f947bcbde699f7e7426bea3ecb9d6cec291cf88ca5b8309d868 WatchSource:0}: Error finding container f26ce995b5116f947bcbde699f7e7426bea3ecb9d6cec291cf88ca5b8309d868: Status 404 returned error can't find the container with id f26ce995b5116f947bcbde699f7e7426bea3ecb9d6cec291cf88ca5b8309d868 Oct 07 17:49:19 crc kubenswrapper[4681]: I1007 17:49:19.041853 4681 generic.go:334] "Generic (PLEG): container finished" podID="48717f89-70f6-4e1e-ba52-77ba39cd7929" containerID="4901cbb768bf46f7eeae42866bcc83d07ff87cd06f5723b146fb2874bd32ca32" exitCode=0 Oct 07 17:49:19 crc kubenswrapper[4681]: I1007 17:49:19.043244 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7nsr" event={"ID":"48717f89-70f6-4e1e-ba52-77ba39cd7929","Type":"ContainerDied","Data":"4901cbb768bf46f7eeae42866bcc83d07ff87cd06f5723b146fb2874bd32ca32"} Oct 07 17:49:19 crc kubenswrapper[4681]: I1007 17:49:19.043628 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7nsr" event={"ID":"48717f89-70f6-4e1e-ba52-77ba39cd7929","Type":"ContainerStarted","Data":"f26ce995b5116f947bcbde699f7e7426bea3ecb9d6cec291cf88ca5b8309d868"} Oct 07 17:49:20 crc kubenswrapper[4681]: I1007 17:49:20.052549 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7nsr" event={"ID":"48717f89-70f6-4e1e-ba52-77ba39cd7929","Type":"ContainerStarted","Data":"aea5a8cd0858f893f5c8305aec0268130e0bc676c02f0b30b44d7a48f5185811"} Oct 07 17:49:21 crc kubenswrapper[4681]: I1007 17:49:21.111564 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m4cj7" Oct 07 17:49:21 crc kubenswrapper[4681]: I1007 17:49:21.111624 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m4cj7" Oct 07 17:49:21 crc kubenswrapper[4681]: I1007 17:49:21.176429 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m4cj7" Oct 07 
Oct 07 17:49:22 crc kubenswrapper[4681]: I1007 17:49:22.071967 4681 generic.go:334] "Generic (PLEG): container finished" podID="48717f89-70f6-4e1e-ba52-77ba39cd7929" containerID="aea5a8cd0858f893f5c8305aec0268130e0bc676c02f0b30b44d7a48f5185811" exitCode=0
Oct 07 17:49:22 crc kubenswrapper[4681]: I1007 17:49:22.072058 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7nsr" event={"ID":"48717f89-70f6-4e1e-ba52-77ba39cd7929","Type":"ContainerDied","Data":"aea5a8cd0858f893f5c8305aec0268130e0bc676c02f0b30b44d7a48f5185811"}
Oct 07 17:49:22 crc kubenswrapper[4681]: I1007 17:49:22.127191 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m4cj7"
Oct 07 17:49:23 crc kubenswrapper[4681]: I1007 17:49:23.082400 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7nsr" event={"ID":"48717f89-70f6-4e1e-ba52-77ba39cd7929","Type":"ContainerStarted","Data":"c97c0483479c8155a37bc0bd0288b86617b6e41eaaa6de05e3133de6ad08eecb"}
Oct 07 17:49:23 crc kubenswrapper[4681]: I1007 17:49:23.111925 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v7nsr" podStartSLOduration=2.40853897 podStartE2EDuration="6.111909595s" podCreationTimestamp="2025-10-07 17:49:17 +0000 UTC" firstStartedPulling="2025-10-07 17:49:19.044084558 +0000 UTC m=+2762.691496113" lastFinishedPulling="2025-10-07 17:49:22.747455173 +0000 UTC m=+2766.394866738" observedRunningTime="2025-10-07 17:49:23.104027545 +0000 UTC m=+2766.751439170" watchObservedRunningTime="2025-10-07 17:49:23.111909595 +0000 UTC m=+2766.759321150"
Oct 07 17:49:24 crc kubenswrapper[4681]: I1007 17:49:24.554583 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m4cj7"]
Oct 07 17:49:24 crc kubenswrapper[4681]: I1007 17:49:24.555062 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m4cj7" podUID="5d3b32a5-797b-4f04-bdcb-9976eddaeb48" containerName="registry-server" containerID="cri-o://d0be9993b1bfbcb0c8f12060f7f2357499ec8e721e1630fbdfbf6a870dca87eb" gracePeriod=2
Oct 07 17:49:25 crc kubenswrapper[4681]: I1007 17:49:25.019572 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m4cj7"
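[editor's note] gracePeriod=2 in the entry above means the runtime gets two seconds between SIGTERM and SIGKILL for registry-server. A sketch of that contract against a plain OS process, assuming a local *os.Process stands in for the CRI StopContainer call:

package main

import (
	"os/exec"
	"syscall"
	"time"
	"os"
)

// killWithGracePeriod sends SIGTERM, waits up to grace, then SIGKILLs.
// This mirrors the kubelet/runtime contract, not kubelet's own code.
func killWithGracePeriod(p *os.Process, grace time.Duration) error {
	if err := p.Signal(syscall.SIGTERM); err != nil {
		return err
	}
	done := make(chan error, 1)
	go func() {
		_, err := p.Wait()
		done <- err
	}()
	select {
	case err := <-done:
		return err // exited within the grace period
	case <-time.After(grace):
		return p.Kill() // grace period expired: SIGKILL
	}
}

func main() {
	cmd := exec.Command("sleep", "30") // demo victim process
	_ = cmd.Start()
	_ = killWithGracePeriod(cmd.Process, 2*time.Second)
}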
Need to start a new one" pod="openshift-marketplace/community-operators-m4cj7" Oct 07 17:49:25 crc kubenswrapper[4681]: I1007 17:49:25.101506 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kkhw\" (UniqueName: \"kubernetes.io/projected/5d3b32a5-797b-4f04-bdcb-9976eddaeb48-kube-api-access-9kkhw\") pod \"5d3b32a5-797b-4f04-bdcb-9976eddaeb48\" (UID: \"5d3b32a5-797b-4f04-bdcb-9976eddaeb48\") " Oct 07 17:49:25 crc kubenswrapper[4681]: I1007 17:49:25.101923 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d3b32a5-797b-4f04-bdcb-9976eddaeb48-utilities\") pod \"5d3b32a5-797b-4f04-bdcb-9976eddaeb48\" (UID: \"5d3b32a5-797b-4f04-bdcb-9976eddaeb48\") " Oct 07 17:49:25 crc kubenswrapper[4681]: I1007 17:49:25.101949 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d3b32a5-797b-4f04-bdcb-9976eddaeb48-catalog-content\") pod \"5d3b32a5-797b-4f04-bdcb-9976eddaeb48\" (UID: \"5d3b32a5-797b-4f04-bdcb-9976eddaeb48\") " Oct 07 17:49:25 crc kubenswrapper[4681]: I1007 17:49:25.102433 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d3b32a5-797b-4f04-bdcb-9976eddaeb48-utilities" (OuterVolumeSpecName: "utilities") pod "5d3b32a5-797b-4f04-bdcb-9976eddaeb48" (UID: "5d3b32a5-797b-4f04-bdcb-9976eddaeb48"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:49:25 crc kubenswrapper[4681]: I1007 17:49:25.103231 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d3b32a5-797b-4f04-bdcb-9976eddaeb48-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 17:49:25 crc kubenswrapper[4681]: I1007 17:49:25.105599 4681 generic.go:334] "Generic (PLEG): container finished" podID="5d3b32a5-797b-4f04-bdcb-9976eddaeb48" containerID="d0be9993b1bfbcb0c8f12060f7f2357499ec8e721e1630fbdfbf6a870dca87eb" exitCode=0 Oct 07 17:49:25 crc kubenswrapper[4681]: I1007 17:49:25.105625 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m4cj7" event={"ID":"5d3b32a5-797b-4f04-bdcb-9976eddaeb48","Type":"ContainerDied","Data":"d0be9993b1bfbcb0c8f12060f7f2357499ec8e721e1630fbdfbf6a870dca87eb"} Oct 07 17:49:25 crc kubenswrapper[4681]: I1007 17:49:25.105649 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m4cj7" event={"ID":"5d3b32a5-797b-4f04-bdcb-9976eddaeb48","Type":"ContainerDied","Data":"69f39f552ea4e22f89e6e66098447e3bddae5f9c4b29d3156527457d8c55dd56"} Oct 07 17:49:25 crc kubenswrapper[4681]: I1007 17:49:25.105666 4681 scope.go:117] "RemoveContainer" containerID="d0be9993b1bfbcb0c8f12060f7f2357499ec8e721e1630fbdfbf6a870dca87eb" Oct 07 17:49:25 crc kubenswrapper[4681]: I1007 17:49:25.105773 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m4cj7" Oct 07 17:49:25 crc kubenswrapper[4681]: I1007 17:49:25.108842 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d3b32a5-797b-4f04-bdcb-9976eddaeb48-kube-api-access-9kkhw" (OuterVolumeSpecName: "kube-api-access-9kkhw") pod "5d3b32a5-797b-4f04-bdcb-9976eddaeb48" (UID: "5d3b32a5-797b-4f04-bdcb-9976eddaeb48"). InnerVolumeSpecName "kube-api-access-9kkhw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:49:25 crc kubenswrapper[4681]: I1007 17:49:25.167951 4681 scope.go:117] "RemoveContainer" containerID="2c093f5812faf2276926c788cfc50432fcf9c433d85a493cac5a77a8dff62b55" Oct 07 17:49:25 crc kubenswrapper[4681]: I1007 17:49:25.171852 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d3b32a5-797b-4f04-bdcb-9976eddaeb48-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d3b32a5-797b-4f04-bdcb-9976eddaeb48" (UID: "5d3b32a5-797b-4f04-bdcb-9976eddaeb48"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:49:25 crc kubenswrapper[4681]: I1007 17:49:25.187599 4681 scope.go:117] "RemoveContainer" containerID="ee6294ab52bc235a597ff818b6ced5d0ed6698eeaacd4a80eaee164db71fe598" Oct 07 17:49:25 crc kubenswrapper[4681]: I1007 17:49:25.205443 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d3b32a5-797b-4f04-bdcb-9976eddaeb48-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 17:49:25 crc kubenswrapper[4681]: I1007 17:49:25.205481 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kkhw\" (UniqueName: \"kubernetes.io/projected/5d3b32a5-797b-4f04-bdcb-9976eddaeb48-kube-api-access-9kkhw\") on node \"crc\" DevicePath \"\"" Oct 07 17:49:25 crc kubenswrapper[4681]: I1007 17:49:25.243901 4681 scope.go:117] "RemoveContainer" containerID="d0be9993b1bfbcb0c8f12060f7f2357499ec8e721e1630fbdfbf6a870dca87eb" Oct 07 17:49:25 crc kubenswrapper[4681]: E1007 17:49:25.244527 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0be9993b1bfbcb0c8f12060f7f2357499ec8e721e1630fbdfbf6a870dca87eb\": container with ID starting with d0be9993b1bfbcb0c8f12060f7f2357499ec8e721e1630fbdfbf6a870dca87eb not found: ID does not exist" containerID="d0be9993b1bfbcb0c8f12060f7f2357499ec8e721e1630fbdfbf6a870dca87eb" Oct 07 17:49:25 crc kubenswrapper[4681]: I1007 17:49:25.244568 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0be9993b1bfbcb0c8f12060f7f2357499ec8e721e1630fbdfbf6a870dca87eb"} err="failed to get container status \"d0be9993b1bfbcb0c8f12060f7f2357499ec8e721e1630fbdfbf6a870dca87eb\": rpc error: code = NotFound desc = could not find container \"d0be9993b1bfbcb0c8f12060f7f2357499ec8e721e1630fbdfbf6a870dca87eb\": container with ID starting with d0be9993b1bfbcb0c8f12060f7f2357499ec8e721e1630fbdfbf6a870dca87eb not found: ID does not exist" Oct 07 17:49:25 crc kubenswrapper[4681]: I1007 17:49:25.244595 4681 scope.go:117] "RemoveContainer" containerID="2c093f5812faf2276926c788cfc50432fcf9c433d85a493cac5a77a8dff62b55" Oct 07 17:49:25 crc kubenswrapper[4681]: E1007 17:49:25.244943 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c093f5812faf2276926c788cfc50432fcf9c433d85a493cac5a77a8dff62b55\": container with ID starting with 2c093f5812faf2276926c788cfc50432fcf9c433d85a493cac5a77a8dff62b55 not found: ID does not exist" containerID="2c093f5812faf2276926c788cfc50432fcf9c433d85a493cac5a77a8dff62b55" Oct 07 17:49:25 crc kubenswrapper[4681]: I1007 17:49:25.244973 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c093f5812faf2276926c788cfc50432fcf9c433d85a493cac5a77a8dff62b55"} err="failed to get container 
status \"2c093f5812faf2276926c788cfc50432fcf9c433d85a493cac5a77a8dff62b55\": rpc error: code = NotFound desc = could not find container \"2c093f5812faf2276926c788cfc50432fcf9c433d85a493cac5a77a8dff62b55\": container with ID starting with 2c093f5812faf2276926c788cfc50432fcf9c433d85a493cac5a77a8dff62b55 not found: ID does not exist" Oct 07 17:49:25 crc kubenswrapper[4681]: I1007 17:49:25.244995 4681 scope.go:117] "RemoveContainer" containerID="ee6294ab52bc235a597ff818b6ced5d0ed6698eeaacd4a80eaee164db71fe598" Oct 07 17:49:25 crc kubenswrapper[4681]: E1007 17:49:25.245360 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee6294ab52bc235a597ff818b6ced5d0ed6698eeaacd4a80eaee164db71fe598\": container with ID starting with ee6294ab52bc235a597ff818b6ced5d0ed6698eeaacd4a80eaee164db71fe598 not found: ID does not exist" containerID="ee6294ab52bc235a597ff818b6ced5d0ed6698eeaacd4a80eaee164db71fe598" Oct 07 17:49:25 crc kubenswrapper[4681]: I1007 17:49:25.245385 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee6294ab52bc235a597ff818b6ced5d0ed6698eeaacd4a80eaee164db71fe598"} err="failed to get container status \"ee6294ab52bc235a597ff818b6ced5d0ed6698eeaacd4a80eaee164db71fe598\": rpc error: code = NotFound desc = could not find container \"ee6294ab52bc235a597ff818b6ced5d0ed6698eeaacd4a80eaee164db71fe598\": container with ID starting with ee6294ab52bc235a597ff818b6ced5d0ed6698eeaacd4a80eaee164db71fe598 not found: ID does not exist" Oct 07 17:49:25 crc kubenswrapper[4681]: I1007 17:49:25.437126 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m4cj7"] Oct 07 17:49:25 crc kubenswrapper[4681]: I1007 17:49:25.444946 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m4cj7"] Oct 07 17:49:27 crc kubenswrapper[4681]: I1007 17:49:27.039351 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d3b32a5-797b-4f04-bdcb-9976eddaeb48" path="/var/lib/kubelet/pods/5d3b32a5-797b-4f04-bdcb-9976eddaeb48/volumes" Oct 07 17:49:27 crc kubenswrapper[4681]: I1007 17:49:27.728522 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v7nsr" Oct 07 17:49:27 crc kubenswrapper[4681]: I1007 17:49:27.728585 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v7nsr" Oct 07 17:49:27 crc kubenswrapper[4681]: I1007 17:49:27.790715 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v7nsr" Oct 07 17:49:28 crc kubenswrapper[4681]: I1007 17:49:28.178686 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v7nsr" Oct 07 17:49:28 crc kubenswrapper[4681]: I1007 17:49:28.555435 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v7nsr"] Oct 07 17:49:30 crc kubenswrapper[4681]: I1007 17:49:30.148496 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v7nsr" podUID="48717f89-70f6-4e1e-ba52-77ba39cd7929" containerName="registry-server" containerID="cri-o://c97c0483479c8155a37bc0bd0288b86617b6e41eaaa6de05e3133de6ad08eecb" gracePeriod=2 Oct 07 17:49:30 crc kubenswrapper[4681]: I1007 17:49:30.607391 4681 util.go:48] "No 
Oct 07 17:49:30 crc kubenswrapper[4681]: I1007 17:49:30.701806 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wvgg\" (UniqueName: \"kubernetes.io/projected/48717f89-70f6-4e1e-ba52-77ba39cd7929-kube-api-access-7wvgg\") pod \"48717f89-70f6-4e1e-ba52-77ba39cd7929\" (UID: \"48717f89-70f6-4e1e-ba52-77ba39cd7929\") "
Oct 07 17:49:30 crc kubenswrapper[4681]: I1007 17:49:30.702160 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48717f89-70f6-4e1e-ba52-77ba39cd7929-catalog-content\") pod \"48717f89-70f6-4e1e-ba52-77ba39cd7929\" (UID: \"48717f89-70f6-4e1e-ba52-77ba39cd7929\") "
Oct 07 17:49:30 crc kubenswrapper[4681]: I1007 17:49:30.702361 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48717f89-70f6-4e1e-ba52-77ba39cd7929-utilities\") pod \"48717f89-70f6-4e1e-ba52-77ba39cd7929\" (UID: \"48717f89-70f6-4e1e-ba52-77ba39cd7929\") "
Oct 07 17:49:30 crc kubenswrapper[4681]: I1007 17:49:30.703983 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48717f89-70f6-4e1e-ba52-77ba39cd7929-utilities" (OuterVolumeSpecName: "utilities") pod "48717f89-70f6-4e1e-ba52-77ba39cd7929" (UID: "48717f89-70f6-4e1e-ba52-77ba39cd7929"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 17:49:30 crc kubenswrapper[4681]: I1007 17:49:30.709249 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48717f89-70f6-4e1e-ba52-77ba39cd7929-kube-api-access-7wvgg" (OuterVolumeSpecName: "kube-api-access-7wvgg") pod "48717f89-70f6-4e1e-ba52-77ba39cd7929" (UID: "48717f89-70f6-4e1e-ba52-77ba39cd7929"). InnerVolumeSpecName "kube-api-access-7wvgg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 17:49:30 crc kubenswrapper[4681]: I1007 17:49:30.753084 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48717f89-70f6-4e1e-ba52-77ba39cd7929-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48717f89-70f6-4e1e-ba52-77ba39cd7929" (UID: "48717f89-70f6-4e1e-ba52-77ba39cd7929"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:49:30 crc kubenswrapper[4681]: I1007 17:49:30.804448 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48717f89-70f6-4e1e-ba52-77ba39cd7929-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 17:49:30 crc kubenswrapper[4681]: I1007 17:49:30.804486 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wvgg\" (UniqueName: \"kubernetes.io/projected/48717f89-70f6-4e1e-ba52-77ba39cd7929-kube-api-access-7wvgg\") on node \"crc\" DevicePath \"\"" Oct 07 17:49:30 crc kubenswrapper[4681]: I1007 17:49:30.804498 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48717f89-70f6-4e1e-ba52-77ba39cd7929-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 17:49:31 crc kubenswrapper[4681]: I1007 17:49:31.158309 4681 generic.go:334] "Generic (PLEG): container finished" podID="48717f89-70f6-4e1e-ba52-77ba39cd7929" containerID="c97c0483479c8155a37bc0bd0288b86617b6e41eaaa6de05e3133de6ad08eecb" exitCode=0 Oct 07 17:49:31 crc kubenswrapper[4681]: I1007 17:49:31.158543 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v7nsr" Oct 07 17:49:31 crc kubenswrapper[4681]: I1007 17:49:31.158559 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7nsr" event={"ID":"48717f89-70f6-4e1e-ba52-77ba39cd7929","Type":"ContainerDied","Data":"c97c0483479c8155a37bc0bd0288b86617b6e41eaaa6de05e3133de6ad08eecb"} Oct 07 17:49:31 crc kubenswrapper[4681]: I1007 17:49:31.159501 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7nsr" event={"ID":"48717f89-70f6-4e1e-ba52-77ba39cd7929","Type":"ContainerDied","Data":"f26ce995b5116f947bcbde699f7e7426bea3ecb9d6cec291cf88ca5b8309d868"} Oct 07 17:49:31 crc kubenswrapper[4681]: I1007 17:49:31.159523 4681 scope.go:117] "RemoveContainer" containerID="c97c0483479c8155a37bc0bd0288b86617b6e41eaaa6de05e3133de6ad08eecb" Oct 07 17:49:31 crc kubenswrapper[4681]: I1007 17:49:31.183033 4681 scope.go:117] "RemoveContainer" containerID="aea5a8cd0858f893f5c8305aec0268130e0bc676c02f0b30b44d7a48f5185811" Oct 07 17:49:31 crc kubenswrapper[4681]: I1007 17:49:31.185990 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v7nsr"] Oct 07 17:49:31 crc kubenswrapper[4681]: I1007 17:49:31.192224 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v7nsr"] Oct 07 17:49:31 crc kubenswrapper[4681]: I1007 17:49:31.242046 4681 scope.go:117] "RemoveContainer" containerID="4901cbb768bf46f7eeae42866bcc83d07ff87cd06f5723b146fb2874bd32ca32" Oct 07 17:49:31 crc kubenswrapper[4681]: I1007 17:49:31.285268 4681 scope.go:117] "RemoveContainer" containerID="c97c0483479c8155a37bc0bd0288b86617b6e41eaaa6de05e3133de6ad08eecb" Oct 07 17:49:31 crc kubenswrapper[4681]: E1007 17:49:31.285972 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c97c0483479c8155a37bc0bd0288b86617b6e41eaaa6de05e3133de6ad08eecb\": container with ID starting with c97c0483479c8155a37bc0bd0288b86617b6e41eaaa6de05e3133de6ad08eecb not found: ID does not exist" containerID="c97c0483479c8155a37bc0bd0288b86617b6e41eaaa6de05e3133de6ad08eecb" Oct 07 17:49:31 crc kubenswrapper[4681]: I1007 17:49:31.286046 
Oct 07 17:49:31 crc kubenswrapper[4681]: I1007 17:49:31.286112 4681 scope.go:117] "RemoveContainer" containerID="aea5a8cd0858f893f5c8305aec0268130e0bc676c02f0b30b44d7a48f5185811"
Oct 07 17:49:31 crc kubenswrapper[4681]: E1007 17:49:31.286721 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aea5a8cd0858f893f5c8305aec0268130e0bc676c02f0b30b44d7a48f5185811\": container with ID starting with aea5a8cd0858f893f5c8305aec0268130e0bc676c02f0b30b44d7a48f5185811 not found: ID does not exist" containerID="aea5a8cd0858f893f5c8305aec0268130e0bc676c02f0b30b44d7a48f5185811"
Oct 07 17:49:31 crc kubenswrapper[4681]: I1007 17:49:31.286757 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aea5a8cd0858f893f5c8305aec0268130e0bc676c02f0b30b44d7a48f5185811"} err="failed to get container status \"aea5a8cd0858f893f5c8305aec0268130e0bc676c02f0b30b44d7a48f5185811\": rpc error: code = NotFound desc = could not find container \"aea5a8cd0858f893f5c8305aec0268130e0bc676c02f0b30b44d7a48f5185811\": container with ID starting with aea5a8cd0858f893f5c8305aec0268130e0bc676c02f0b30b44d7a48f5185811 not found: ID does not exist"
Oct 07 17:49:31 crc kubenswrapper[4681]: I1007 17:49:31.286797 4681 scope.go:117] "RemoveContainer" containerID="4901cbb768bf46f7eeae42866bcc83d07ff87cd06f5723b146fb2874bd32ca32"
Oct 07 17:49:31 crc kubenswrapper[4681]: E1007 17:49:31.287085 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4901cbb768bf46f7eeae42866bcc83d07ff87cd06f5723b146fb2874bd32ca32\": container with ID starting with 4901cbb768bf46f7eeae42866bcc83d07ff87cd06f5723b146fb2874bd32ca32 not found: ID does not exist" containerID="4901cbb768bf46f7eeae42866bcc83d07ff87cd06f5723b146fb2874bd32ca32"
Oct 07 17:49:31 crc kubenswrapper[4681]: I1007 17:49:31.287112 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4901cbb768bf46f7eeae42866bcc83d07ff87cd06f5723b146fb2874bd32ca32"} err="failed to get container status \"4901cbb768bf46f7eeae42866bcc83d07ff87cd06f5723b146fb2874bd32ca32\": rpc error: code = NotFound desc = could not find container \"4901cbb768bf46f7eeae42866bcc83d07ff87cd06f5723b146fb2874bd32ca32\": container with ID starting with 4901cbb768bf46f7eeae42866bcc83d07ff87cd06f5723b146fb2874bd32ca32 not found: ID does not exist"
Oct 07 17:49:33 crc kubenswrapper[4681]: I1007 17:49:33.042830 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48717f89-70f6-4e1e-ba52-77ba39cd7929" path="/var/lib/kubelet/pods/48717f89-70f6-4e1e-ba52-77ba39cd7929/volumes"
Oct 07 17:50:41 crc kubenswrapper[4681]: I1007 17:50:41.735694 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-58b7954b47-8j9j9" podUID="642b1a07-3c90-40b5-b6cb-af1d8832649b" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502"
statuscode: 502" Oct 07 17:50:42 crc kubenswrapper[4681]: I1007 17:50:42.195695 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 17:50:42 crc kubenswrapper[4681]: I1007 17:50:42.195756 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 17:51:12 crc kubenswrapper[4681]: I1007 17:51:12.194758 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 17:51:12 crc kubenswrapper[4681]: I1007 17:51:12.195463 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 17:51:42 crc kubenswrapper[4681]: I1007 17:51:42.195108 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 17:51:42 crc kubenswrapper[4681]: I1007 17:51:42.195557 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 17:51:42 crc kubenswrapper[4681]: I1007 17:51:42.195598 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" Oct 07 17:51:42 crc kubenswrapper[4681]: I1007 17:51:42.196256 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b"} pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 17:51:42 crc kubenswrapper[4681]: I1007 17:51:42.196339 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" containerID="cri-o://6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b" gracePeriod=600 Oct 07 17:51:42 crc kubenswrapper[4681]: E1007 17:51:42.333722 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
Oct 07 17:51:42 crc kubenswrapper[4681]: I1007 17:51:42.343707 4681 generic.go:334] "Generic (PLEG): container finished" podID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b" exitCode=0
Oct 07 17:51:42 crc kubenswrapper[4681]: I1007 17:51:42.343746 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerDied","Data":"6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b"}
Oct 07 17:51:42 crc kubenswrapper[4681]: I1007 17:51:42.343776 4681 scope.go:117] "RemoveContainer" containerID="4a44aa30a0b57e231fa4a946a2c711605ee1432e6ff6cac75f3ee512920db919"
Oct 07 17:51:42 crc kubenswrapper[4681]: I1007 17:51:42.344334 4681 scope.go:117] "RemoveContainer" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b"
Oct 07 17:51:42 crc kubenswrapper[4681]: E1007 17:51:42.344562 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea"
Oct 07 17:51:56 crc kubenswrapper[4681]: I1007 17:51:56.029223 4681 scope.go:117] "RemoveContainer" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b"
Oct 07 17:51:56 crc kubenswrapper[4681]: E1007 17:51:56.029933 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea"
Oct 07 17:52:00 crc kubenswrapper[4681]: I1007 17:52:00.538966 4681 generic.go:334] "Generic (PLEG): container finished" podID="a7d237e9-d752-4244-8f32-be01a5ca3f6f" containerID="bc6517b0cb29aad4fbfd92dde2e3be4bec0fa4c71c581211128620976c7ce2d5" exitCode=0
Oct 07 17:52:00 crc kubenswrapper[4681]: I1007 17:52:00.539262 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" event={"ID":"a7d237e9-d752-4244-8f32-be01a5ca3f6f","Type":"ContainerDied","Data":"bc6517b0cb29aad4fbfd92dde2e3be4bec0fa4c71c581211128620976c7ce2d5"}
Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.052106 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9"
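[editor's note] "back-off 5m0s" is CrashLoopBackOff at its ceiling: the kubelet delays each restart exponentially, doubling from an initial delay (10s in stock kubelet, assumed here) up to a 5m cap, which is the figure shown in the log. A sketch of that schedule:

package sketch

import "time"

// restartDelay returns the CrashLoopBackOff delay before restart attempt n
// (n starting at 0): initial * 2^n, capped at max. The 10s initial value is
// an assumption; the 5m cap matches "back-off 5m0s" in the entries above.
func restartDelay(n int, initial, max time.Duration) time.Duration {
	d := initial
	for i := 0; i < n; i++ {
		d *= 2
		if d >= max {
			return max
		}
	}
	return d
}

// restartDelay(0, 10*time.Second, 5*time.Minute) == 10s
// restartDelay(5, 10*time.Second, 5*time.Minute) == 5m0s (320s, capped)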
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.146189 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-inventory\") pod \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.146258 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-migration-ssh-key-1\") pod \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.146283 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfpkx\" (UniqueName: \"kubernetes.io/projected/a7d237e9-d752-4244-8f32-be01a5ca3f6f-kube-api-access-mfpkx\") pod \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.146352 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-cell1-compute-config-1\") pod \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.146369 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-migration-ssh-key-0\") pod \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.146414 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-combined-ca-bundle\") pod \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.146444 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-cell1-compute-config-0\") pod \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.146544 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-ssh-key\") pod \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.146623 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-extra-config-0\") pod \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\" (UID: \"a7d237e9-d752-4244-8f32-be01a5ca3f6f\") " Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.160513 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "a7d237e9-d752-4244-8f32-be01a5ca3f6f" (UID: "a7d237e9-d752-4244-8f32-be01a5ca3f6f"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.170533 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7d237e9-d752-4244-8f32-be01a5ca3f6f-kube-api-access-mfpkx" (OuterVolumeSpecName: "kube-api-access-mfpkx") pod "a7d237e9-d752-4244-8f32-be01a5ca3f6f" (UID: "a7d237e9-d752-4244-8f32-be01a5ca3f6f"). InnerVolumeSpecName "kube-api-access-mfpkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.179574 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "a7d237e9-d752-4244-8f32-be01a5ca3f6f" (UID: "a7d237e9-d752-4244-8f32-be01a5ca3f6f"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.184910 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-inventory" (OuterVolumeSpecName: "inventory") pod "a7d237e9-d752-4244-8f32-be01a5ca3f6f" (UID: "a7d237e9-d752-4244-8f32-be01a5ca3f6f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.190099 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "a7d237e9-d752-4244-8f32-be01a5ca3f6f" (UID: "a7d237e9-d752-4244-8f32-be01a5ca3f6f"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.193044 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "a7d237e9-d752-4244-8f32-be01a5ca3f6f" (UID: "a7d237e9-d752-4244-8f32-be01a5ca3f6f"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.196119 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "a7d237e9-d752-4244-8f32-be01a5ca3f6f" (UID: "a7d237e9-d752-4244-8f32-be01a5ca3f6f"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.203914 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "a7d237e9-d752-4244-8f32-be01a5ca3f6f" (UID: "a7d237e9-d752-4244-8f32-be01a5ca3f6f"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.206215 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a7d237e9-d752-4244-8f32-be01a5ca3f6f" (UID: "a7d237e9-d752-4244-8f32-be01a5ca3f6f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.249742 4681 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.249830 4681 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.249999 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfpkx\" (UniqueName: \"kubernetes.io/projected/a7d237e9-d752-4244-8f32-be01a5ca3f6f-kube-api-access-mfpkx\") on node \"crc\" DevicePath \"\"" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.250020 4681 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.250036 4681 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.250337 4681 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.250366 4681 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.250418 4681 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7d237e9-d752-4244-8f32-be01a5ca3f6f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.250568 4681 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a7d237e9-d752-4244-8f32-be01a5ca3f6f-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.557427 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9" event={"ID":"a7d237e9-d752-4244-8f32-be01a5ca3f6f","Type":"ContainerDied","Data":"71f4858d12f90ad9f0f1a4ba5973a33d9051633cd8c42a53e540516b1ae33414"} Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.557464 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71f4858d12f90ad9f0f1a4ba5973a33d9051633cd8c42a53e540516b1ae33414" Oct 07 17:52:02 crc 
Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.557513 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rk6x9"
Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.691638 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw"]
Oct 07 17:52:02 crc kubenswrapper[4681]: E1007 17:52:02.692479 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d3b32a5-797b-4f04-bdcb-9976eddaeb48" containerName="extract-utilities"
Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.692546 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d3b32a5-797b-4f04-bdcb-9976eddaeb48" containerName="extract-utilities"
Oct 07 17:52:02 crc kubenswrapper[4681]: E1007 17:52:02.692638 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d3b32a5-797b-4f04-bdcb-9976eddaeb48" containerName="registry-server"
Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.692700 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d3b32a5-797b-4f04-bdcb-9976eddaeb48" containerName="registry-server"
Oct 07 17:52:02 crc kubenswrapper[4681]: E1007 17:52:02.692789 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d3b32a5-797b-4f04-bdcb-9976eddaeb48" containerName="extract-content"
Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.692846 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d3b32a5-797b-4f04-bdcb-9976eddaeb48" containerName="extract-content"
Oct 07 17:52:02 crc kubenswrapper[4681]: E1007 17:52:02.692949 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48717f89-70f6-4e1e-ba52-77ba39cd7929" containerName="extract-utilities"
Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.693006 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="48717f89-70f6-4e1e-ba52-77ba39cd7929" containerName="extract-utilities"
Oct 07 17:52:02 crc kubenswrapper[4681]: E1007 17:52:02.693064 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48717f89-70f6-4e1e-ba52-77ba39cd7929" containerName="extract-content"
Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.693131 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="48717f89-70f6-4e1e-ba52-77ba39cd7929" containerName="extract-content"
Oct 07 17:52:02 crc kubenswrapper[4681]: E1007 17:52:02.694681 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48717f89-70f6-4e1e-ba52-77ba39cd7929" containerName="registry-server"
Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.694738 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="48717f89-70f6-4e1e-ba52-77ba39cd7929" containerName="registry-server"
Oct 07 17:52:02 crc kubenswrapper[4681]: E1007 17:52:02.694812 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d237e9-d752-4244-8f32-be01a5ca3f6f" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.694875 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d237e9-d752-4244-8f32-be01a5ca3f6f" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.695139 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d3b32a5-797b-4f04-bdcb-9976eddaeb48" containerName="registry-server"
Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.695218 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7d237e9-d752-4244-8f32-be01a5ca3f6f" containerName="nova-edpm-deployment-openstack-edpm-ipam"
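[editor's note] The cpu_manager/memory_manager burst above is admission-time housekeeping: before placing the new telemetry pod, both managers prune per-container resource assignments whose pods (the three deleted earlier in this log) are no longer active. A sketch of that pruning over an illustrative assignment map; the types are not the kubelet's:

package sketch

// containerKey identifies one container's resource-state entry.
type containerKey struct{ podUID, containerName string }

// removeStaleState drops assignments for containers whose pod has left
// the active set, matching the "RemoveStaleState: removing container" /
// "Deleted CPUSet assignment" pairs above.
func removeStaleState(assignments map[containerKey]struct{}, activePods map[string]bool) {
	for key := range assignments {
		if !activePods[key.podUID] {
			delete(assignments, key)
		}
	}
}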
podUID="a7d237e9-d752-4244-8f32-be01a5ca3f6f" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.695284 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="48717f89-70f6-4e1e-ba52-77ba39cd7929" containerName="registry-server" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.695957 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.704345 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.704466 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.704517 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.704428 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.705152 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw"] Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.706104 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2vtl6" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.758384 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rbskw\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.758434 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rbskw\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.758548 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rbskw\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.758575 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rbskw\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 
Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.758728 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rbskw\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw"
Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.758797 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rbskw\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw"
Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.758905 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrs4g\" (UniqueName: \"kubernetes.io/projected/650f08d2-bbd6-4cf7-b8d1-5923a4075672-kube-api-access-hrs4g\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rbskw\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw"
Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.860772 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrs4g\" (UniqueName: \"kubernetes.io/projected/650f08d2-bbd6-4cf7-b8d1-5923a4075672-kube-api-access-hrs4g\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rbskw\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw"
Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.860846 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rbskw\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw"
Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.860870 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rbskw\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw"
Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.860968 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rbskw\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw"
Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.860991 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rbskw\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw"
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.861026 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rbskw\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.861049 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rbskw\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.864693 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rbskw\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.864944 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rbskw\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.865052 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rbskw\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.865539 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rbskw\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.868385 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rbskw\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.873623 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rbskw\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw" Oct 07 17:52:02 crc kubenswrapper[4681]: I1007 17:52:02.879359 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrs4g\" (UniqueName: \"kubernetes.io/projected/650f08d2-bbd6-4cf7-b8d1-5923a4075672-kube-api-access-hrs4g\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rbskw\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw" Oct 07 17:52:03 crc kubenswrapper[4681]: I1007 17:52:03.026738 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw" Oct 07 17:52:03 crc kubenswrapper[4681]: I1007 17:52:03.546562 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw"] Oct 07 17:52:03 crc kubenswrapper[4681]: I1007 17:52:03.584089 4681 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 17:52:04 crc kubenswrapper[4681]: I1007 17:52:04.583367 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw" event={"ID":"650f08d2-bbd6-4cf7-b8d1-5923a4075672","Type":"ContainerStarted","Data":"371c657f74eef40fc3ceca1a695e84a52d5bc0bdfb48d2589722df893fd48be4"} Oct 07 17:52:04 crc kubenswrapper[4681]: I1007 17:52:04.584285 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw" event={"ID":"650f08d2-bbd6-4cf7-b8d1-5923a4075672","Type":"ContainerStarted","Data":"02d2abc38d43e47f9ad734c2a704ce651780e8bbf9f3c3cbce741ced088ddd60"} Oct 07 17:52:04 crc kubenswrapper[4681]: I1007 17:52:04.611673 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw" podStartSLOduration=2.461199671 podStartE2EDuration="2.611658884s" podCreationTimestamp="2025-10-07 17:52:02 +0000 UTC" firstStartedPulling="2025-10-07 17:52:03.58381892 +0000 UTC m=+2927.231230475" lastFinishedPulling="2025-10-07 17:52:03.734278133 +0000 UTC m=+2927.381689688" observedRunningTime="2025-10-07 17:52:04.609036591 +0000 UTC m=+2928.256448156" watchObservedRunningTime="2025-10-07 17:52:04.611658884 +0000 UTC m=+2928.259070439" Oct 07 17:52:10 crc kubenswrapper[4681]: I1007 17:52:10.029728 4681 scope.go:117] "RemoveContainer" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b" Oct 07 17:52:10 crc kubenswrapper[4681]: E1007 17:52:10.030345 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:52:23 crc kubenswrapper[4681]: I1007 17:52:23.028916 4681 scope.go:117] "RemoveContainer" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b" Oct 07 17:52:23 crc kubenswrapper[4681]: E1007 17:52:23.029593 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
Oct 07 17:52:29 crc kubenswrapper[4681]: I1007 17:52:29.076477 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6wgpj"]
Oct 07 17:52:29 crc kubenswrapper[4681]: I1007 17:52:29.085330 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6wgpj"
Oct 07 17:52:29 crc kubenswrapper[4681]: I1007 17:52:29.092066 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6wgpj"]
Oct 07 17:52:29 crc kubenswrapper[4681]: I1007 17:52:29.163451 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62a2688e-16c7-4431-a836-e8036707fa93-catalog-content\") pod \"redhat-marketplace-6wgpj\" (UID: \"62a2688e-16c7-4431-a836-e8036707fa93\") " pod="openshift-marketplace/redhat-marketplace-6wgpj"
Oct 07 17:52:29 crc kubenswrapper[4681]: I1007 17:52:29.163811 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52h9t\" (UniqueName: \"kubernetes.io/projected/62a2688e-16c7-4431-a836-e8036707fa93-kube-api-access-52h9t\") pod \"redhat-marketplace-6wgpj\" (UID: \"62a2688e-16c7-4431-a836-e8036707fa93\") " pod="openshift-marketplace/redhat-marketplace-6wgpj"
Oct 07 17:52:29 crc kubenswrapper[4681]: I1007 17:52:29.163859 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62a2688e-16c7-4431-a836-e8036707fa93-utilities\") pod \"redhat-marketplace-6wgpj\" (UID: \"62a2688e-16c7-4431-a836-e8036707fa93\") " pod="openshift-marketplace/redhat-marketplace-6wgpj"
Oct 07 17:52:29 crc kubenswrapper[4681]: I1007 17:52:29.265448 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62a2688e-16c7-4431-a836-e8036707fa93-utilities\") pod \"redhat-marketplace-6wgpj\" (UID: \"62a2688e-16c7-4431-a836-e8036707fa93\") " pod="openshift-marketplace/redhat-marketplace-6wgpj"
Oct 07 17:52:29 crc kubenswrapper[4681]: I1007 17:52:29.265603 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62a2688e-16c7-4431-a836-e8036707fa93-catalog-content\") pod \"redhat-marketplace-6wgpj\" (UID: \"62a2688e-16c7-4431-a836-e8036707fa93\") " pod="openshift-marketplace/redhat-marketplace-6wgpj"
Oct 07 17:52:29 crc kubenswrapper[4681]: I1007 17:52:29.265653 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52h9t\" (UniqueName: \"kubernetes.io/projected/62a2688e-16c7-4431-a836-e8036707fa93-kube-api-access-52h9t\") pod \"redhat-marketplace-6wgpj\" (UID: \"62a2688e-16c7-4431-a836-e8036707fa93\") " pod="openshift-marketplace/redhat-marketplace-6wgpj"
Oct 07 17:52:29 crc kubenswrapper[4681]: I1007 17:52:29.265905 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62a2688e-16c7-4431-a836-e8036707fa93-utilities\") pod \"redhat-marketplace-6wgpj\" (UID: \"62a2688e-16c7-4431-a836-e8036707fa93\") " 
pod="openshift-marketplace/redhat-marketplace-6wgpj" Oct 07 17:52:29 crc kubenswrapper[4681]: I1007 17:52:29.266216 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62a2688e-16c7-4431-a836-e8036707fa93-catalog-content\") pod \"redhat-marketplace-6wgpj\" (UID: \"62a2688e-16c7-4431-a836-e8036707fa93\") " pod="openshift-marketplace/redhat-marketplace-6wgpj" Oct 07 17:52:29 crc kubenswrapper[4681]: I1007 17:52:29.285562 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52h9t\" (UniqueName: \"kubernetes.io/projected/62a2688e-16c7-4431-a836-e8036707fa93-kube-api-access-52h9t\") pod \"redhat-marketplace-6wgpj\" (UID: \"62a2688e-16c7-4431-a836-e8036707fa93\") " pod="openshift-marketplace/redhat-marketplace-6wgpj" Oct 07 17:52:29 crc kubenswrapper[4681]: I1007 17:52:29.414993 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6wgpj" Oct 07 17:52:29 crc kubenswrapper[4681]: I1007 17:52:29.915734 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6wgpj"] Oct 07 17:52:30 crc kubenswrapper[4681]: I1007 17:52:30.803758 4681 generic.go:334] "Generic (PLEG): container finished" podID="62a2688e-16c7-4431-a836-e8036707fa93" containerID="46ad01df10758b360c107dddd00bec1b836cead8e11769068429c8593af3d888" exitCode=0 Oct 07 17:52:30 crc kubenswrapper[4681]: I1007 17:52:30.803804 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6wgpj" event={"ID":"62a2688e-16c7-4431-a836-e8036707fa93","Type":"ContainerDied","Data":"46ad01df10758b360c107dddd00bec1b836cead8e11769068429c8593af3d888"} Oct 07 17:52:30 crc kubenswrapper[4681]: I1007 17:52:30.804727 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6wgpj" event={"ID":"62a2688e-16c7-4431-a836-e8036707fa93","Type":"ContainerStarted","Data":"e8eb8db696498b048bc870a31cabf606ee7701eda622768b06a1ac51884a056e"} Oct 07 17:52:32 crc kubenswrapper[4681]: I1007 17:52:32.825282 4681 generic.go:334] "Generic (PLEG): container finished" podID="62a2688e-16c7-4431-a836-e8036707fa93" containerID="fe145ef4847a24cecee1e905b3e4199637fe70f7059856d6ef8cb894eef13876" exitCode=0 Oct 07 17:52:32 crc kubenswrapper[4681]: I1007 17:52:32.825368 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6wgpj" event={"ID":"62a2688e-16c7-4431-a836-e8036707fa93","Type":"ContainerDied","Data":"fe145ef4847a24cecee1e905b3e4199637fe70f7059856d6ef8cb894eef13876"} Oct 07 17:52:33 crc kubenswrapper[4681]: I1007 17:52:33.837401 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6wgpj" event={"ID":"62a2688e-16c7-4431-a836-e8036707fa93","Type":"ContainerStarted","Data":"8ba05902f4a5ae67e8aff82645d88683f515e8dd82b30f1782591df46129456e"} Oct 07 17:52:35 crc kubenswrapper[4681]: I1007 17:52:35.029500 4681 scope.go:117] "RemoveContainer" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b" Oct 07 17:52:35 crc kubenswrapper[4681]: E1007 17:52:35.029776 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:52:39 crc kubenswrapper[4681]: I1007 17:52:39.415870 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6wgpj" Oct 07 17:52:39 crc kubenswrapper[4681]: I1007 17:52:39.416163 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6wgpj" Oct 07 17:52:39 crc kubenswrapper[4681]: I1007 17:52:39.462842 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6wgpj" Oct 07 17:52:39 crc kubenswrapper[4681]: I1007 17:52:39.492545 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6wgpj" podStartSLOduration=7.9134909140000005 podStartE2EDuration="10.492525842s" podCreationTimestamp="2025-10-07 17:52:29 +0000 UTC" firstStartedPulling="2025-10-07 17:52:30.80513221 +0000 UTC m=+2954.452543765" lastFinishedPulling="2025-10-07 17:52:33.384167138 +0000 UTC m=+2957.031578693" observedRunningTime="2025-10-07 17:52:34.893508598 +0000 UTC m=+2958.540920163" watchObservedRunningTime="2025-10-07 17:52:39.492525842 +0000 UTC m=+2963.139937407" Oct 07 17:52:39 crc kubenswrapper[4681]: I1007 17:52:39.938342 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6wgpj" Oct 07 17:52:39 crc kubenswrapper[4681]: I1007 17:52:39.983834 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6wgpj"] Oct 07 17:52:41 crc kubenswrapper[4681]: I1007 17:52:41.904693 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6wgpj" podUID="62a2688e-16c7-4431-a836-e8036707fa93" containerName="registry-server" containerID="cri-o://8ba05902f4a5ae67e8aff82645d88683f515e8dd82b30f1782591df46129456e" gracePeriod=2 Oct 07 17:52:42 crc kubenswrapper[4681]: I1007 17:52:42.383913 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6wgpj" Oct 07 17:52:42 crc kubenswrapper[4681]: I1007 17:52:42.508546 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52h9t\" (UniqueName: \"kubernetes.io/projected/62a2688e-16c7-4431-a836-e8036707fa93-kube-api-access-52h9t\") pod \"62a2688e-16c7-4431-a836-e8036707fa93\" (UID: \"62a2688e-16c7-4431-a836-e8036707fa93\") " Oct 07 17:52:42 crc kubenswrapper[4681]: I1007 17:52:42.508988 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62a2688e-16c7-4431-a836-e8036707fa93-utilities\") pod \"62a2688e-16c7-4431-a836-e8036707fa93\" (UID: \"62a2688e-16c7-4431-a836-e8036707fa93\") " Oct 07 17:52:42 crc kubenswrapper[4681]: I1007 17:52:42.509078 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62a2688e-16c7-4431-a836-e8036707fa93-catalog-content\") pod \"62a2688e-16c7-4431-a836-e8036707fa93\" (UID: \"62a2688e-16c7-4431-a836-e8036707fa93\") " Oct 07 17:52:42 crc kubenswrapper[4681]: I1007 17:52:42.509769 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62a2688e-16c7-4431-a836-e8036707fa93-utilities" (OuterVolumeSpecName: "utilities") pod "62a2688e-16c7-4431-a836-e8036707fa93" (UID: "62a2688e-16c7-4431-a836-e8036707fa93"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:52:42 crc kubenswrapper[4681]: I1007 17:52:42.515689 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62a2688e-16c7-4431-a836-e8036707fa93-kube-api-access-52h9t" (OuterVolumeSpecName: "kube-api-access-52h9t") pod "62a2688e-16c7-4431-a836-e8036707fa93" (UID: "62a2688e-16c7-4431-a836-e8036707fa93"). InnerVolumeSpecName "kube-api-access-52h9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:52:42 crc kubenswrapper[4681]: I1007 17:52:42.523033 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62a2688e-16c7-4431-a836-e8036707fa93-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "62a2688e-16c7-4431-a836-e8036707fa93" (UID: "62a2688e-16c7-4431-a836-e8036707fa93"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:52:42 crc kubenswrapper[4681]: I1007 17:52:42.610933 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62a2688e-16c7-4431-a836-e8036707fa93-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 17:52:42 crc kubenswrapper[4681]: I1007 17:52:42.610967 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62a2688e-16c7-4431-a836-e8036707fa93-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 17:52:42 crc kubenswrapper[4681]: I1007 17:52:42.610979 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52h9t\" (UniqueName: \"kubernetes.io/projected/62a2688e-16c7-4431-a836-e8036707fa93-kube-api-access-52h9t\") on node \"crc\" DevicePath \"\"" Oct 07 17:52:42 crc kubenswrapper[4681]: I1007 17:52:42.914192 4681 generic.go:334] "Generic (PLEG): container finished" podID="62a2688e-16c7-4431-a836-e8036707fa93" containerID="8ba05902f4a5ae67e8aff82645d88683f515e8dd82b30f1782591df46129456e" exitCode=0 Oct 07 17:52:42 crc kubenswrapper[4681]: I1007 17:52:42.914250 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6wgpj" event={"ID":"62a2688e-16c7-4431-a836-e8036707fa93","Type":"ContainerDied","Data":"8ba05902f4a5ae67e8aff82645d88683f515e8dd82b30f1782591df46129456e"} Oct 07 17:52:42 crc kubenswrapper[4681]: I1007 17:52:42.914277 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6wgpj" event={"ID":"62a2688e-16c7-4431-a836-e8036707fa93","Type":"ContainerDied","Data":"e8eb8db696498b048bc870a31cabf606ee7701eda622768b06a1ac51884a056e"} Oct 07 17:52:42 crc kubenswrapper[4681]: I1007 17:52:42.914294 4681 scope.go:117] "RemoveContainer" containerID="8ba05902f4a5ae67e8aff82645d88683f515e8dd82b30f1782591df46129456e" Oct 07 17:52:42 crc kubenswrapper[4681]: I1007 17:52:42.914451 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6wgpj" Oct 07 17:52:42 crc kubenswrapper[4681]: I1007 17:52:42.947802 4681 scope.go:117] "RemoveContainer" containerID="fe145ef4847a24cecee1e905b3e4199637fe70f7059856d6ef8cb894eef13876" Oct 07 17:52:42 crc kubenswrapper[4681]: I1007 17:52:42.956766 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6wgpj"] Oct 07 17:52:42 crc kubenswrapper[4681]: I1007 17:52:42.967263 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6wgpj"] Oct 07 17:52:42 crc kubenswrapper[4681]: I1007 17:52:42.982031 4681 scope.go:117] "RemoveContainer" containerID="46ad01df10758b360c107dddd00bec1b836cead8e11769068429c8593af3d888" Oct 07 17:52:43 crc kubenswrapper[4681]: I1007 17:52:43.025366 4681 scope.go:117] "RemoveContainer" containerID="8ba05902f4a5ae67e8aff82645d88683f515e8dd82b30f1782591df46129456e" Oct 07 17:52:43 crc kubenswrapper[4681]: E1007 17:52:43.027076 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ba05902f4a5ae67e8aff82645d88683f515e8dd82b30f1782591df46129456e\": container with ID starting with 8ba05902f4a5ae67e8aff82645d88683f515e8dd82b30f1782591df46129456e not found: ID does not exist" containerID="8ba05902f4a5ae67e8aff82645d88683f515e8dd82b30f1782591df46129456e" Oct 07 17:52:43 crc kubenswrapper[4681]: I1007 17:52:43.027111 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ba05902f4a5ae67e8aff82645d88683f515e8dd82b30f1782591df46129456e"} err="failed to get container status \"8ba05902f4a5ae67e8aff82645d88683f515e8dd82b30f1782591df46129456e\": rpc error: code = NotFound desc = could not find container \"8ba05902f4a5ae67e8aff82645d88683f515e8dd82b30f1782591df46129456e\": container with ID starting with 8ba05902f4a5ae67e8aff82645d88683f515e8dd82b30f1782591df46129456e not found: ID does not exist" Oct 07 17:52:43 crc kubenswrapper[4681]: I1007 17:52:43.027133 4681 scope.go:117] "RemoveContainer" containerID="fe145ef4847a24cecee1e905b3e4199637fe70f7059856d6ef8cb894eef13876" Oct 07 17:52:43 crc kubenswrapper[4681]: E1007 17:52:43.027372 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe145ef4847a24cecee1e905b3e4199637fe70f7059856d6ef8cb894eef13876\": container with ID starting with fe145ef4847a24cecee1e905b3e4199637fe70f7059856d6ef8cb894eef13876 not found: ID does not exist" containerID="fe145ef4847a24cecee1e905b3e4199637fe70f7059856d6ef8cb894eef13876" Oct 07 17:52:43 crc kubenswrapper[4681]: I1007 17:52:43.027397 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe145ef4847a24cecee1e905b3e4199637fe70f7059856d6ef8cb894eef13876"} err="failed to get container status \"fe145ef4847a24cecee1e905b3e4199637fe70f7059856d6ef8cb894eef13876\": rpc error: code = NotFound desc = could not find container \"fe145ef4847a24cecee1e905b3e4199637fe70f7059856d6ef8cb894eef13876\": container with ID starting with fe145ef4847a24cecee1e905b3e4199637fe70f7059856d6ef8cb894eef13876 not found: ID does not exist" Oct 07 17:52:43 crc kubenswrapper[4681]: I1007 17:52:43.027415 4681 scope.go:117] "RemoveContainer" containerID="46ad01df10758b360c107dddd00bec1b836cead8e11769068429c8593af3d888" Oct 07 17:52:43 crc kubenswrapper[4681]: E1007 17:52:43.027679 4681 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"46ad01df10758b360c107dddd00bec1b836cead8e11769068429c8593af3d888\": container with ID starting with 46ad01df10758b360c107dddd00bec1b836cead8e11769068429c8593af3d888 not found: ID does not exist" containerID="46ad01df10758b360c107dddd00bec1b836cead8e11769068429c8593af3d888" Oct 07 17:52:43 crc kubenswrapper[4681]: I1007 17:52:43.027698 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46ad01df10758b360c107dddd00bec1b836cead8e11769068429c8593af3d888"} err="failed to get container status \"46ad01df10758b360c107dddd00bec1b836cead8e11769068429c8593af3d888\": rpc error: code = NotFound desc = could not find container \"46ad01df10758b360c107dddd00bec1b836cead8e11769068429c8593af3d888\": container with ID starting with 46ad01df10758b360c107dddd00bec1b836cead8e11769068429c8593af3d888 not found: ID does not exist" Oct 07 17:52:43 crc kubenswrapper[4681]: I1007 17:52:43.051354 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62a2688e-16c7-4431-a836-e8036707fa93" path="/var/lib/kubelet/pods/62a2688e-16c7-4431-a836-e8036707fa93/volumes" Oct 07 17:52:46 crc kubenswrapper[4681]: I1007 17:52:46.029348 4681 scope.go:117] "RemoveContainer" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b" Oct 07 17:52:46 crc kubenswrapper[4681]: E1007 17:52:46.029804 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:53:00 crc kubenswrapper[4681]: I1007 17:53:00.029917 4681 scope.go:117] "RemoveContainer" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b" Oct 07 17:53:00 crc kubenswrapper[4681]: E1007 17:53:00.030554 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:53:13 crc kubenswrapper[4681]: I1007 17:53:13.029578 4681 scope.go:117] "RemoveContainer" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b" Oct 07 17:53:13 crc kubenswrapper[4681]: E1007 17:53:13.031043 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:53:24 crc kubenswrapper[4681]: I1007 17:53:24.029102 4681 scope.go:117] "RemoveContainer" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b" Oct 07 17:53:24 crc kubenswrapper[4681]: E1007 17:53:24.029802 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:53:38 crc kubenswrapper[4681]: I1007 17:53:38.029852 4681 scope.go:117] "RemoveContainer" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b" Oct 07 17:53:38 crc kubenswrapper[4681]: E1007 17:53:38.030678 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:53:50 crc kubenswrapper[4681]: I1007 17:53:50.029220 4681 scope.go:117] "RemoveContainer" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b" Oct 07 17:53:50 crc kubenswrapper[4681]: E1007 17:53:50.030137 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:54:01 crc kubenswrapper[4681]: I1007 17:54:01.030214 4681 scope.go:117] "RemoveContainer" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b" Oct 07 17:54:01 crc kubenswrapper[4681]: E1007 17:54:01.031321 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:54:13 crc kubenswrapper[4681]: I1007 17:54:13.029369 4681 scope.go:117] "RemoveContainer" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b" Oct 07 17:54:13 crc kubenswrapper[4681]: E1007 17:54:13.031731 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:54:24 crc kubenswrapper[4681]: I1007 17:54:24.029653 4681 scope.go:117] "RemoveContainer" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b" Oct 07 17:54:24 crc kubenswrapper[4681]: E1007 17:54:24.030549 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
Oct 07 17:54:39 crc kubenswrapper[4681]: I1007 17:54:39.029672 4681 scope.go:117] "RemoveContainer" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b"
Oct 07 17:54:39 crc kubenswrapper[4681]: E1007 17:54:39.030484 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea"
Oct 07 17:54:51 crc kubenswrapper[4681]: I1007 17:54:51.029137 4681 scope.go:117] "RemoveContainer" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b"
Oct 07 17:54:51 crc kubenswrapper[4681]: E1007 17:54:51.029802 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea"
Oct 07 17:55:03 crc kubenswrapper[4681]: I1007 17:55:03.029189 4681 scope.go:117] "RemoveContainer" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b"
Oct 07 17:55:03 crc kubenswrapper[4681]: E1007 17:55:03.030017 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea"
Oct 07 17:55:17 crc kubenswrapper[4681]: I1007 17:55:17.042963 4681 scope.go:117] "RemoveContainer" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b"
Oct 07 17:55:17 crc kubenswrapper[4681]: E1007 17:55:17.043746 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea"
Oct 07 17:55:26 crc kubenswrapper[4681]: I1007 17:55:26.521440 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xgb9z"]
Oct 07 17:55:26 crc kubenswrapper[4681]: E1007 17:55:26.525037 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62a2688e-16c7-4431-a836-e8036707fa93" containerName="registry-server"
Oct 07 17:55:26 crc kubenswrapper[4681]: I1007 17:55:26.525067 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="62a2688e-16c7-4431-a836-e8036707fa93" containerName="registry-server"
Oct 07 17:55:26 crc kubenswrapper[4681]: E1007 17:55:26.525091 4681 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62a2688e-16c7-4431-a836-e8036707fa93" containerName="extract-content" Oct 07 17:55:26 crc kubenswrapper[4681]: I1007 17:55:26.525099 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="62a2688e-16c7-4431-a836-e8036707fa93" containerName="extract-content" Oct 07 17:55:26 crc kubenswrapper[4681]: E1007 17:55:26.525122 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62a2688e-16c7-4431-a836-e8036707fa93" containerName="extract-utilities" Oct 07 17:55:26 crc kubenswrapper[4681]: I1007 17:55:26.525130 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="62a2688e-16c7-4431-a836-e8036707fa93" containerName="extract-utilities" Oct 07 17:55:26 crc kubenswrapper[4681]: I1007 17:55:26.525344 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="62a2688e-16c7-4431-a836-e8036707fa93" containerName="registry-server" Oct 07 17:55:26 crc kubenswrapper[4681]: I1007 17:55:26.527606 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xgb9z" Oct 07 17:55:26 crc kubenswrapper[4681]: I1007 17:55:26.553172 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xgb9z"] Oct 07 17:55:26 crc kubenswrapper[4681]: I1007 17:55:26.609862 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdwzr\" (UniqueName: \"kubernetes.io/projected/94da300e-8abe-47a6-835b-af8711f5e03e-kube-api-access-jdwzr\") pod \"redhat-operators-xgb9z\" (UID: \"94da300e-8abe-47a6-835b-af8711f5e03e\") " pod="openshift-marketplace/redhat-operators-xgb9z" Oct 07 17:55:26 crc kubenswrapper[4681]: I1007 17:55:26.609930 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94da300e-8abe-47a6-835b-af8711f5e03e-utilities\") pod \"redhat-operators-xgb9z\" (UID: \"94da300e-8abe-47a6-835b-af8711f5e03e\") " pod="openshift-marketplace/redhat-operators-xgb9z" Oct 07 17:55:26 crc kubenswrapper[4681]: I1007 17:55:26.610390 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94da300e-8abe-47a6-835b-af8711f5e03e-catalog-content\") pod \"redhat-operators-xgb9z\" (UID: \"94da300e-8abe-47a6-835b-af8711f5e03e\") " pod="openshift-marketplace/redhat-operators-xgb9z" Oct 07 17:55:26 crc kubenswrapper[4681]: I1007 17:55:26.712680 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94da300e-8abe-47a6-835b-af8711f5e03e-catalog-content\") pod \"redhat-operators-xgb9z\" (UID: \"94da300e-8abe-47a6-835b-af8711f5e03e\") " pod="openshift-marketplace/redhat-operators-xgb9z" Oct 07 17:55:26 crc kubenswrapper[4681]: I1007 17:55:26.712764 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdwzr\" (UniqueName: \"kubernetes.io/projected/94da300e-8abe-47a6-835b-af8711f5e03e-kube-api-access-jdwzr\") pod \"redhat-operators-xgb9z\" (UID: \"94da300e-8abe-47a6-835b-af8711f5e03e\") " pod="openshift-marketplace/redhat-operators-xgb9z" Oct 07 17:55:26 crc kubenswrapper[4681]: I1007 17:55:26.712808 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/94da300e-8abe-47a6-835b-af8711f5e03e-utilities\") pod \"redhat-operators-xgb9z\" (UID: \"94da300e-8abe-47a6-835b-af8711f5e03e\") " pod="openshift-marketplace/redhat-operators-xgb9z" Oct 07 17:55:26 crc kubenswrapper[4681]: I1007 17:55:26.713321 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94da300e-8abe-47a6-835b-af8711f5e03e-catalog-content\") pod \"redhat-operators-xgb9z\" (UID: \"94da300e-8abe-47a6-835b-af8711f5e03e\") " pod="openshift-marketplace/redhat-operators-xgb9z" Oct 07 17:55:26 crc kubenswrapper[4681]: I1007 17:55:26.713506 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94da300e-8abe-47a6-835b-af8711f5e03e-utilities\") pod \"redhat-operators-xgb9z\" (UID: \"94da300e-8abe-47a6-835b-af8711f5e03e\") " pod="openshift-marketplace/redhat-operators-xgb9z" Oct 07 17:55:26 crc kubenswrapper[4681]: I1007 17:55:26.733174 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdwzr\" (UniqueName: \"kubernetes.io/projected/94da300e-8abe-47a6-835b-af8711f5e03e-kube-api-access-jdwzr\") pod \"redhat-operators-xgb9z\" (UID: \"94da300e-8abe-47a6-835b-af8711f5e03e\") " pod="openshift-marketplace/redhat-operators-xgb9z" Oct 07 17:55:26 crc kubenswrapper[4681]: I1007 17:55:26.853524 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xgb9z" Oct 07 17:55:27 crc kubenswrapper[4681]: I1007 17:55:27.447247 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xgb9z"] Oct 07 17:55:28 crc kubenswrapper[4681]: I1007 17:55:28.469321 4681 generic.go:334] "Generic (PLEG): container finished" podID="94da300e-8abe-47a6-835b-af8711f5e03e" containerID="a8032f4193dddd43932aa6f87b9950b4c1dd84ae4b538af5fa1163f5edd773ec" exitCode=0 Oct 07 17:55:28 crc kubenswrapper[4681]: I1007 17:55:28.469354 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xgb9z" event={"ID":"94da300e-8abe-47a6-835b-af8711f5e03e","Type":"ContainerDied","Data":"a8032f4193dddd43932aa6f87b9950b4c1dd84ae4b538af5fa1163f5edd773ec"} Oct 07 17:55:28 crc kubenswrapper[4681]: I1007 17:55:28.469800 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xgb9z" event={"ID":"94da300e-8abe-47a6-835b-af8711f5e03e","Type":"ContainerStarted","Data":"1fa4b403b6bbc0e4955c126ac27fb6667673ae307aaed7000928825aa1dd3259"} Oct 07 17:55:30 crc kubenswrapper[4681]: I1007 17:55:30.029056 4681 scope.go:117] "RemoveContainer" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b" Oct 07 17:55:30 crc kubenswrapper[4681]: E1007 17:55:30.029608 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:55:30 crc kubenswrapper[4681]: I1007 17:55:30.489653 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xgb9z" 
event={"ID":"94da300e-8abe-47a6-835b-af8711f5e03e","Type":"ContainerStarted","Data":"725a114feab89c4809c889f49c57fbdb2eaa1dc93efcb69d900ff533d47097c8"} Oct 07 17:55:34 crc kubenswrapper[4681]: I1007 17:55:34.530011 4681 generic.go:334] "Generic (PLEG): container finished" podID="94da300e-8abe-47a6-835b-af8711f5e03e" containerID="725a114feab89c4809c889f49c57fbdb2eaa1dc93efcb69d900ff533d47097c8" exitCode=0 Oct 07 17:55:34 crc kubenswrapper[4681]: I1007 17:55:34.530092 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xgb9z" event={"ID":"94da300e-8abe-47a6-835b-af8711f5e03e","Type":"ContainerDied","Data":"725a114feab89c4809c889f49c57fbdb2eaa1dc93efcb69d900ff533d47097c8"} Oct 07 17:55:35 crc kubenswrapper[4681]: I1007 17:55:35.544039 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xgb9z" event={"ID":"94da300e-8abe-47a6-835b-af8711f5e03e","Type":"ContainerStarted","Data":"bf8368a374a80ed257d2c11590daf0457d990e57f9329b0e5759bf0c631062b1"} Oct 07 17:55:35 crc kubenswrapper[4681]: I1007 17:55:35.566480 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xgb9z" podStartSLOduration=2.933861192 podStartE2EDuration="9.566463523s" podCreationTimestamp="2025-10-07 17:55:26 +0000 UTC" firstStartedPulling="2025-10-07 17:55:28.471941756 +0000 UTC m=+3132.119353311" lastFinishedPulling="2025-10-07 17:55:35.104544087 +0000 UTC m=+3138.751955642" observedRunningTime="2025-10-07 17:55:35.564451766 +0000 UTC m=+3139.211863341" watchObservedRunningTime="2025-10-07 17:55:35.566463523 +0000 UTC m=+3139.213875078" Oct 07 17:55:36 crc kubenswrapper[4681]: I1007 17:55:36.854713 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xgb9z" Oct 07 17:55:36 crc kubenswrapper[4681]: I1007 17:55:36.855046 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xgb9z" Oct 07 17:55:37 crc kubenswrapper[4681]: I1007 17:55:37.900078 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xgb9z" podUID="94da300e-8abe-47a6-835b-af8711f5e03e" containerName="registry-server" probeResult="failure" output=< Oct 07 17:55:37 crc kubenswrapper[4681]: timeout: failed to connect service ":50051" within 1s Oct 07 17:55:37 crc kubenswrapper[4681]: > Oct 07 17:55:45 crc kubenswrapper[4681]: I1007 17:55:45.029803 4681 scope.go:117] "RemoveContainer" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b" Oct 07 17:55:45 crc kubenswrapper[4681]: E1007 17:55:45.030588 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:55:46 crc kubenswrapper[4681]: I1007 17:55:46.903130 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xgb9z" Oct 07 17:55:46 crc kubenswrapper[4681]: I1007 17:55:46.958223 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xgb9z" Oct 07 17:55:47 crc 
kubenswrapper[4681]: I1007 17:55:47.142309 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xgb9z"] Oct 07 17:55:48 crc kubenswrapper[4681]: I1007 17:55:48.647104 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xgb9z" podUID="94da300e-8abe-47a6-835b-af8711f5e03e" containerName="registry-server" containerID="cri-o://bf8368a374a80ed257d2c11590daf0457d990e57f9329b0e5759bf0c631062b1" gracePeriod=2 Oct 07 17:55:49 crc kubenswrapper[4681]: I1007 17:55:49.168503 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xgb9z" Oct 07 17:55:49 crc kubenswrapper[4681]: I1007 17:55:49.351396 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94da300e-8abe-47a6-835b-af8711f5e03e-utilities\") pod \"94da300e-8abe-47a6-835b-af8711f5e03e\" (UID: \"94da300e-8abe-47a6-835b-af8711f5e03e\") " Oct 07 17:55:49 crc kubenswrapper[4681]: I1007 17:55:49.351991 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdwzr\" (UniqueName: \"kubernetes.io/projected/94da300e-8abe-47a6-835b-af8711f5e03e-kube-api-access-jdwzr\") pod \"94da300e-8abe-47a6-835b-af8711f5e03e\" (UID: \"94da300e-8abe-47a6-835b-af8711f5e03e\") " Oct 07 17:55:49 crc kubenswrapper[4681]: I1007 17:55:49.352330 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94da300e-8abe-47a6-835b-af8711f5e03e-utilities" (OuterVolumeSpecName: "utilities") pod "94da300e-8abe-47a6-835b-af8711f5e03e" (UID: "94da300e-8abe-47a6-835b-af8711f5e03e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:55:49 crc kubenswrapper[4681]: I1007 17:55:49.352429 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94da300e-8abe-47a6-835b-af8711f5e03e-catalog-content\") pod \"94da300e-8abe-47a6-835b-af8711f5e03e\" (UID: \"94da300e-8abe-47a6-835b-af8711f5e03e\") " Oct 07 17:55:49 crc kubenswrapper[4681]: I1007 17:55:49.352917 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94da300e-8abe-47a6-835b-af8711f5e03e-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 17:55:49 crc kubenswrapper[4681]: I1007 17:55:49.387702 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94da300e-8abe-47a6-835b-af8711f5e03e-kube-api-access-jdwzr" (OuterVolumeSpecName: "kube-api-access-jdwzr") pod "94da300e-8abe-47a6-835b-af8711f5e03e" (UID: "94da300e-8abe-47a6-835b-af8711f5e03e"). InnerVolumeSpecName "kube-api-access-jdwzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:55:49 crc kubenswrapper[4681]: I1007 17:55:49.440570 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94da300e-8abe-47a6-835b-af8711f5e03e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94da300e-8abe-47a6-835b-af8711f5e03e" (UID: "94da300e-8abe-47a6-835b-af8711f5e03e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 17:55:49 crc kubenswrapper[4681]: I1007 17:55:49.454977 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdwzr\" (UniqueName: \"kubernetes.io/projected/94da300e-8abe-47a6-835b-af8711f5e03e-kube-api-access-jdwzr\") on node \"crc\" DevicePath \"\"" Oct 07 17:55:49 crc kubenswrapper[4681]: I1007 17:55:49.455006 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94da300e-8abe-47a6-835b-af8711f5e03e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 17:55:49 crc kubenswrapper[4681]: I1007 17:55:49.661244 4681 generic.go:334] "Generic (PLEG): container finished" podID="94da300e-8abe-47a6-835b-af8711f5e03e" containerID="bf8368a374a80ed257d2c11590daf0457d990e57f9329b0e5759bf0c631062b1" exitCode=0 Oct 07 17:55:49 crc kubenswrapper[4681]: I1007 17:55:49.661316 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xgb9z" event={"ID":"94da300e-8abe-47a6-835b-af8711f5e03e","Type":"ContainerDied","Data":"bf8368a374a80ed257d2c11590daf0457d990e57f9329b0e5759bf0c631062b1"} Oct 07 17:55:49 crc kubenswrapper[4681]: I1007 17:55:49.661358 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xgb9z" event={"ID":"94da300e-8abe-47a6-835b-af8711f5e03e","Type":"ContainerDied","Data":"1fa4b403b6bbc0e4955c126ac27fb6667673ae307aaed7000928825aa1dd3259"} Oct 07 17:55:49 crc kubenswrapper[4681]: I1007 17:55:49.661388 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xgb9z" Oct 07 17:55:49 crc kubenswrapper[4681]: I1007 17:55:49.661396 4681 scope.go:117] "RemoveContainer" containerID="bf8368a374a80ed257d2c11590daf0457d990e57f9329b0e5759bf0c631062b1" Oct 07 17:55:49 crc kubenswrapper[4681]: I1007 17:55:49.698142 4681 scope.go:117] "RemoveContainer" containerID="725a114feab89c4809c889f49c57fbdb2eaa1dc93efcb69d900ff533d47097c8" Oct 07 17:55:49 crc kubenswrapper[4681]: I1007 17:55:49.706460 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xgb9z"] Oct 07 17:55:49 crc kubenswrapper[4681]: I1007 17:55:49.725391 4681 scope.go:117] "RemoveContainer" containerID="a8032f4193dddd43932aa6f87b9950b4c1dd84ae4b538af5fa1163f5edd773ec" Oct 07 17:55:49 crc kubenswrapper[4681]: I1007 17:55:49.733007 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xgb9z"] Oct 07 17:55:49 crc kubenswrapper[4681]: I1007 17:55:49.767743 4681 scope.go:117] "RemoveContainer" containerID="bf8368a374a80ed257d2c11590daf0457d990e57f9329b0e5759bf0c631062b1" Oct 07 17:55:49 crc kubenswrapper[4681]: E1007 17:55:49.768205 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf8368a374a80ed257d2c11590daf0457d990e57f9329b0e5759bf0c631062b1\": container with ID starting with bf8368a374a80ed257d2c11590daf0457d990e57f9329b0e5759bf0c631062b1 not found: ID does not exist" containerID="bf8368a374a80ed257d2c11590daf0457d990e57f9329b0e5759bf0c631062b1" Oct 07 17:55:49 crc kubenswrapper[4681]: I1007 17:55:49.768310 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf8368a374a80ed257d2c11590daf0457d990e57f9329b0e5759bf0c631062b1"} err="failed to get container status \"bf8368a374a80ed257d2c11590daf0457d990e57f9329b0e5759bf0c631062b1\": 
rpc error: code = NotFound desc = could not find container \"bf8368a374a80ed257d2c11590daf0457d990e57f9329b0e5759bf0c631062b1\": container with ID starting with bf8368a374a80ed257d2c11590daf0457d990e57f9329b0e5759bf0c631062b1 not found: ID does not exist" Oct 07 17:55:49 crc kubenswrapper[4681]: I1007 17:55:49.768345 4681 scope.go:117] "RemoveContainer" containerID="725a114feab89c4809c889f49c57fbdb2eaa1dc93efcb69d900ff533d47097c8" Oct 07 17:55:49 crc kubenswrapper[4681]: E1007 17:55:49.768666 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"725a114feab89c4809c889f49c57fbdb2eaa1dc93efcb69d900ff533d47097c8\": container with ID starting with 725a114feab89c4809c889f49c57fbdb2eaa1dc93efcb69d900ff533d47097c8 not found: ID does not exist" containerID="725a114feab89c4809c889f49c57fbdb2eaa1dc93efcb69d900ff533d47097c8" Oct 07 17:55:49 crc kubenswrapper[4681]: I1007 17:55:49.768699 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"725a114feab89c4809c889f49c57fbdb2eaa1dc93efcb69d900ff533d47097c8"} err="failed to get container status \"725a114feab89c4809c889f49c57fbdb2eaa1dc93efcb69d900ff533d47097c8\": rpc error: code = NotFound desc = could not find container \"725a114feab89c4809c889f49c57fbdb2eaa1dc93efcb69d900ff533d47097c8\": container with ID starting with 725a114feab89c4809c889f49c57fbdb2eaa1dc93efcb69d900ff533d47097c8 not found: ID does not exist" Oct 07 17:55:49 crc kubenswrapper[4681]: I1007 17:55:49.768721 4681 scope.go:117] "RemoveContainer" containerID="a8032f4193dddd43932aa6f87b9950b4c1dd84ae4b538af5fa1163f5edd773ec" Oct 07 17:55:49 crc kubenswrapper[4681]: E1007 17:55:49.769051 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8032f4193dddd43932aa6f87b9950b4c1dd84ae4b538af5fa1163f5edd773ec\": container with ID starting with a8032f4193dddd43932aa6f87b9950b4c1dd84ae4b538af5fa1163f5edd773ec not found: ID does not exist" containerID="a8032f4193dddd43932aa6f87b9950b4c1dd84ae4b538af5fa1163f5edd773ec" Oct 07 17:55:49 crc kubenswrapper[4681]: I1007 17:55:49.769187 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8032f4193dddd43932aa6f87b9950b4c1dd84ae4b538af5fa1163f5edd773ec"} err="failed to get container status \"a8032f4193dddd43932aa6f87b9950b4c1dd84ae4b538af5fa1163f5edd773ec\": rpc error: code = NotFound desc = could not find container \"a8032f4193dddd43932aa6f87b9950b4c1dd84ae4b538af5fa1163f5edd773ec\": container with ID starting with a8032f4193dddd43932aa6f87b9950b4c1dd84ae4b538af5fa1163f5edd773ec not found: ID does not exist" Oct 07 17:55:51 crc kubenswrapper[4681]: I1007 17:55:51.041551 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94da300e-8abe-47a6-835b-af8711f5e03e" path="/var/lib/kubelet/pods/94da300e-8abe-47a6-835b-af8711f5e03e/volumes" Oct 07 17:55:52 crc kubenswrapper[4681]: I1007 17:55:52.687123 4681 generic.go:334] "Generic (PLEG): container finished" podID="650f08d2-bbd6-4cf7-b8d1-5923a4075672" containerID="371c657f74eef40fc3ceca1a695e84a52d5bc0bdfb48d2589722df893fd48be4" exitCode=0 Oct 07 17:55:52 crc kubenswrapper[4681]: I1007 17:55:52.687210 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw" 
event={"ID":"650f08d2-bbd6-4cf7-b8d1-5923a4075672","Type":"ContainerDied","Data":"371c657f74eef40fc3ceca1a695e84a52d5bc0bdfb48d2589722df893fd48be4"} Oct 07 17:55:54 crc kubenswrapper[4681]: I1007 17:55:54.124192 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw" Oct 07 17:55:54 crc kubenswrapper[4681]: I1007 17:55:54.260257 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrs4g\" (UniqueName: \"kubernetes.io/projected/650f08d2-bbd6-4cf7-b8d1-5923a4075672-kube-api-access-hrs4g\") pod \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " Oct 07 17:55:54 crc kubenswrapper[4681]: I1007 17:55:54.260681 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-ceilometer-compute-config-data-0\") pod \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " Oct 07 17:55:54 crc kubenswrapper[4681]: I1007 17:55:54.260903 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-ssh-key\") pod \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " Oct 07 17:55:54 crc kubenswrapper[4681]: I1007 17:55:54.261033 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-ceilometer-compute-config-data-1\") pod \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " Oct 07 17:55:54 crc kubenswrapper[4681]: I1007 17:55:54.261168 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-telemetry-combined-ca-bundle\") pod \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " Oct 07 17:55:54 crc kubenswrapper[4681]: I1007 17:55:54.261316 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-inventory\") pod \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " Oct 07 17:55:54 crc kubenswrapper[4681]: I1007 17:55:54.261430 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-ceilometer-compute-config-data-2\") pod \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\" (UID: \"650f08d2-bbd6-4cf7-b8d1-5923a4075672\") " Oct 07 17:55:54 crc kubenswrapper[4681]: I1007 17:55:54.266001 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "650f08d2-bbd6-4cf7-b8d1-5923a4075672" (UID: "650f08d2-bbd6-4cf7-b8d1-5923a4075672"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:55:54 crc kubenswrapper[4681]: I1007 17:55:54.273097 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/650f08d2-bbd6-4cf7-b8d1-5923a4075672-kube-api-access-hrs4g" (OuterVolumeSpecName: "kube-api-access-hrs4g") pod "650f08d2-bbd6-4cf7-b8d1-5923a4075672" (UID: "650f08d2-bbd6-4cf7-b8d1-5923a4075672"). InnerVolumeSpecName "kube-api-access-hrs4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 17:55:54 crc kubenswrapper[4681]: I1007 17:55:54.292502 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-inventory" (OuterVolumeSpecName: "inventory") pod "650f08d2-bbd6-4cf7-b8d1-5923a4075672" (UID: "650f08d2-bbd6-4cf7-b8d1-5923a4075672"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:55:54 crc kubenswrapper[4681]: I1007 17:55:54.292869 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "650f08d2-bbd6-4cf7-b8d1-5923a4075672" (UID: "650f08d2-bbd6-4cf7-b8d1-5923a4075672"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:55:54 crc kubenswrapper[4681]: I1007 17:55:54.297916 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "650f08d2-bbd6-4cf7-b8d1-5923a4075672" (UID: "650f08d2-bbd6-4cf7-b8d1-5923a4075672"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:55:54 crc kubenswrapper[4681]: I1007 17:55:54.301432 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "650f08d2-bbd6-4cf7-b8d1-5923a4075672" (UID: "650f08d2-bbd6-4cf7-b8d1-5923a4075672"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:55:54 crc kubenswrapper[4681]: I1007 17:55:54.320826 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "650f08d2-bbd6-4cf7-b8d1-5923a4075672" (UID: "650f08d2-bbd6-4cf7-b8d1-5923a4075672"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 17:55:54 crc kubenswrapper[4681]: I1007 17:55:54.364478 4681 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 17:55:54 crc kubenswrapper[4681]: I1007 17:55:54.364520 4681 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 07 17:55:54 crc kubenswrapper[4681]: I1007 17:55:54.364537 4681 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 17:55:54 crc kubenswrapper[4681]: I1007 17:55:54.364551 4681 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 17:55:54 crc kubenswrapper[4681]: I1007 17:55:54.364564 4681 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 07 17:55:54 crc kubenswrapper[4681]: I1007 17:55:54.364576 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrs4g\" (UniqueName: \"kubernetes.io/projected/650f08d2-bbd6-4cf7-b8d1-5923a4075672-kube-api-access-hrs4g\") on node \"crc\" DevicePath \"\"" Oct 07 17:55:54 crc kubenswrapper[4681]: I1007 17:55:54.364588 4681 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/650f08d2-bbd6-4cf7-b8d1-5923a4075672-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 07 17:55:54 crc kubenswrapper[4681]: I1007 17:55:54.706844 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw" event={"ID":"650f08d2-bbd6-4cf7-b8d1-5923a4075672","Type":"ContainerDied","Data":"02d2abc38d43e47f9ad734c2a704ce651780e8bbf9f3c3cbce741ced088ddd60"} Oct 07 17:55:54 crc kubenswrapper[4681]: I1007 17:55:54.706900 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02d2abc38d43e47f9ad734c2a704ce651780e8bbf9f3c3cbce741ced088ddd60" Oct 07 17:55:54 crc kubenswrapper[4681]: I1007 17:55:54.706943 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rbskw" Oct 07 17:56:00 crc kubenswrapper[4681]: I1007 17:56:00.029787 4681 scope.go:117] "RemoveContainer" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b" Oct 07 17:56:00 crc kubenswrapper[4681]: E1007 17:56:00.030570 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:56:13 crc kubenswrapper[4681]: I1007 17:56:13.030013 4681 scope.go:117] "RemoveContainer" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b" Oct 07 17:56:13 crc kubenswrapper[4681]: E1007 17:56:13.031011 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:56:28 crc kubenswrapper[4681]: I1007 17:56:28.030577 4681 scope.go:117] "RemoveContainer" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b" Oct 07 17:56:28 crc kubenswrapper[4681]: E1007 17:56:28.031483 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:56:42 crc kubenswrapper[4681]: I1007 17:56:42.028922 4681 scope.go:117] "RemoveContainer" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b" Oct 07 17:56:42 crc kubenswrapper[4681]: E1007 17:56:42.029604 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 17:56:53 crc kubenswrapper[4681]: I1007 17:56:53.030930 4681 scope.go:117] "RemoveContainer" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.235179 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerStarted","Data":"26299cac2d314ca6d5ea24a56aef772ad6bf58bfb220778407adf569e4845a4f"} Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.373323 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 07 17:56:54 crc kubenswrapper[4681]: E1007 17:56:54.374254 4681 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="94da300e-8abe-47a6-835b-af8711f5e03e" containerName="extract-content" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.374270 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="94da300e-8abe-47a6-835b-af8711f5e03e" containerName="extract-content" Oct 07 17:56:54 crc kubenswrapper[4681]: E1007 17:56:54.374293 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94da300e-8abe-47a6-835b-af8711f5e03e" containerName="extract-utilities" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.374299 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="94da300e-8abe-47a6-835b-af8711f5e03e" containerName="extract-utilities" Oct 07 17:56:54 crc kubenswrapper[4681]: E1007 17:56:54.374314 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="650f08d2-bbd6-4cf7-b8d1-5923a4075672" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.374321 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="650f08d2-bbd6-4cf7-b8d1-5923a4075672" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 07 17:56:54 crc kubenswrapper[4681]: E1007 17:56:54.374343 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94da300e-8abe-47a6-835b-af8711f5e03e" containerName="registry-server" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.374348 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="94da300e-8abe-47a6-835b-af8711f5e03e" containerName="registry-server" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.374530 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="94da300e-8abe-47a6-835b-af8711f5e03e" containerName="registry-server" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.374546 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="650f08d2-bbd6-4cf7-b8d1-5923a4075672" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.375268 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.377596 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.377600 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.377631 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.378212 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-pcscr" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.382612 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.452473 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01a2ae55-90f7-432a-bc03-aedd6db91210-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.452852 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/01a2ae55-90f7-432a-bc03-aedd6db91210-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.453032 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/01a2ae55-90f7-432a-bc03-aedd6db91210-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.453161 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/01a2ae55-90f7-432a-bc03-aedd6db91210-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.453397 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/01a2ae55-90f7-432a-bc03-aedd6db91210-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.453611 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01a2ae55-90f7-432a-bc03-aedd6db91210-config-data\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.453841 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/01a2ae55-90f7-432a-bc03-aedd6db91210-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.453983 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj9z8\" (UniqueName: \"kubernetes.io/projected/01a2ae55-90f7-432a-bc03-aedd6db91210-kube-api-access-kj9z8\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.454573 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.557451 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/01a2ae55-90f7-432a-bc03-aedd6db91210-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.557515 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj9z8\" (UniqueName: \"kubernetes.io/projected/01a2ae55-90f7-432a-bc03-aedd6db91210-kube-api-access-kj9z8\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.557582 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.557622 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01a2ae55-90f7-432a-bc03-aedd6db91210-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.557665 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/01a2ae55-90f7-432a-bc03-aedd6db91210-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.557704 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/01a2ae55-90f7-432a-bc03-aedd6db91210-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.557741 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/01a2ae55-90f7-432a-bc03-aedd6db91210-ca-certs\") pod 
\"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.557781 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/01a2ae55-90f7-432a-bc03-aedd6db91210-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.557862 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01a2ae55-90f7-432a-bc03-aedd6db91210-config-data\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.558004 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/01a2ae55-90f7-432a-bc03-aedd6db91210-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.558510 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/01a2ae55-90f7-432a-bc03-aedd6db91210-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.558701 4681 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.559005 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/01a2ae55-90f7-432a-bc03-aedd6db91210-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.559258 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01a2ae55-90f7-432a-bc03-aedd6db91210-config-data\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.562526 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01a2ae55-90f7-432a-bc03-aedd6db91210-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.562709 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/01a2ae55-90f7-432a-bc03-aedd6db91210-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.564615 
4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/01a2ae55-90f7-432a-bc03-aedd6db91210-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.577631 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj9z8\" (UniqueName: \"kubernetes.io/projected/01a2ae55-90f7-432a-bc03-aedd6db91210-kube-api-access-kj9z8\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.592413 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " pod="openstack/tempest-tests-tempest" Oct 07 17:56:54 crc kubenswrapper[4681]: I1007 17:56:54.698362 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 07 17:56:55 crc kubenswrapper[4681]: I1007 17:56:55.169365 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 07 17:56:55 crc kubenswrapper[4681]: I1007 17:56:55.244972 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"01a2ae55-90f7-432a-bc03-aedd6db91210","Type":"ContainerStarted","Data":"d4939819d786119146de3faa1cb9b44e957b0eb222e8e9b537137abbc2e5c787"} Oct 07 17:57:35 crc kubenswrapper[4681]: E1007 17:57:35.213855 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Oct 07 17:57:35 crc kubenswrapper[4681]: E1007 17:57:35.214610 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kj9z8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(01a2ae55-90f7-432a-bc03-aedd6db91210): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 17:57:35 crc kubenswrapper[4681]: E1007 17:57:35.215830 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" 
podUID="01a2ae55-90f7-432a-bc03-aedd6db91210" Oct 07 17:57:35 crc kubenswrapper[4681]: E1007 17:57:35.631527 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="01a2ae55-90f7-432a-bc03-aedd6db91210" Oct 07 17:57:50 crc kubenswrapper[4681]: I1007 17:57:50.031252 4681 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 17:57:50 crc kubenswrapper[4681]: I1007 17:57:50.560357 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 07 17:57:52 crc kubenswrapper[4681]: I1007 17:57:52.806015 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"01a2ae55-90f7-432a-bc03-aedd6db91210","Type":"ContainerStarted","Data":"b371669858cd1272bb0ae822d5f02eae0321bac68b71c14ede2396c4b3df99c0"} Oct 07 17:57:52 crc kubenswrapper[4681]: I1007 17:57:52.828404 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.441543268 podStartE2EDuration="59.828389278s" podCreationTimestamp="2025-10-07 17:56:53 +0000 UTC" firstStartedPulling="2025-10-07 17:56:55.170482082 +0000 UTC m=+3218.817893637" lastFinishedPulling="2025-10-07 17:57:50.557328082 +0000 UTC m=+3274.204739647" observedRunningTime="2025-10-07 17:57:52.822964687 +0000 UTC m=+3276.470376242" watchObservedRunningTime="2025-10-07 17:57:52.828389278 +0000 UTC m=+3276.475800833" Oct 07 17:59:12 crc kubenswrapper[4681]: I1007 17:59:12.195753 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 17:59:12 crc kubenswrapper[4681]: I1007 17:59:12.196422 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 17:59:42 crc kubenswrapper[4681]: I1007 17:59:42.195017 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 17:59:42 crc kubenswrapper[4681]: I1007 17:59:42.195622 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 17:59:53 crc kubenswrapper[4681]: I1007 17:59:53.487119 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ksnlw"] Oct 07 17:59:53 crc kubenswrapper[4681]: I1007 17:59:53.490352 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ksnlw" Oct 07 17:59:53 crc kubenswrapper[4681]: I1007 17:59:53.507439 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ksnlw"] Oct 07 17:59:53 crc kubenswrapper[4681]: I1007 17:59:53.561811 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e8255c9-03f1-4bc6-a043-66d5f2078b2c-catalog-content\") pod \"certified-operators-ksnlw\" (UID: \"5e8255c9-03f1-4bc6-a043-66d5f2078b2c\") " pod="openshift-marketplace/certified-operators-ksnlw" Oct 07 17:59:53 crc kubenswrapper[4681]: I1007 17:59:53.561884 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddhvf\" (UniqueName: \"kubernetes.io/projected/5e8255c9-03f1-4bc6-a043-66d5f2078b2c-kube-api-access-ddhvf\") pod \"certified-operators-ksnlw\" (UID: \"5e8255c9-03f1-4bc6-a043-66d5f2078b2c\") " pod="openshift-marketplace/certified-operators-ksnlw" Oct 07 17:59:53 crc kubenswrapper[4681]: I1007 17:59:53.561990 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e8255c9-03f1-4bc6-a043-66d5f2078b2c-utilities\") pod \"certified-operators-ksnlw\" (UID: \"5e8255c9-03f1-4bc6-a043-66d5f2078b2c\") " pod="openshift-marketplace/certified-operators-ksnlw" Oct 07 17:59:53 crc kubenswrapper[4681]: I1007 17:59:53.663551 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e8255c9-03f1-4bc6-a043-66d5f2078b2c-catalog-content\") pod \"certified-operators-ksnlw\" (UID: \"5e8255c9-03f1-4bc6-a043-66d5f2078b2c\") " pod="openshift-marketplace/certified-operators-ksnlw" Oct 07 17:59:53 crc kubenswrapper[4681]: I1007 17:59:53.663872 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddhvf\" (UniqueName: \"kubernetes.io/projected/5e8255c9-03f1-4bc6-a043-66d5f2078b2c-kube-api-access-ddhvf\") pod \"certified-operators-ksnlw\" (UID: \"5e8255c9-03f1-4bc6-a043-66d5f2078b2c\") " pod="openshift-marketplace/certified-operators-ksnlw" Oct 07 17:59:53 crc kubenswrapper[4681]: I1007 17:59:53.663999 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e8255c9-03f1-4bc6-a043-66d5f2078b2c-utilities\") pod \"certified-operators-ksnlw\" (UID: \"5e8255c9-03f1-4bc6-a043-66d5f2078b2c\") " pod="openshift-marketplace/certified-operators-ksnlw" Oct 07 17:59:53 crc kubenswrapper[4681]: I1007 17:59:53.664094 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e8255c9-03f1-4bc6-a043-66d5f2078b2c-catalog-content\") pod \"certified-operators-ksnlw\" (UID: \"5e8255c9-03f1-4bc6-a043-66d5f2078b2c\") " pod="openshift-marketplace/certified-operators-ksnlw" Oct 07 17:59:53 crc kubenswrapper[4681]: I1007 17:59:53.664416 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e8255c9-03f1-4bc6-a043-66d5f2078b2c-utilities\") pod \"certified-operators-ksnlw\" (UID: \"5e8255c9-03f1-4bc6-a043-66d5f2078b2c\") " pod="openshift-marketplace/certified-operators-ksnlw" Oct 07 17:59:53 crc kubenswrapper[4681]: I1007 17:59:53.688828 4681 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ddhvf\" (UniqueName: \"kubernetes.io/projected/5e8255c9-03f1-4bc6-a043-66d5f2078b2c-kube-api-access-ddhvf\") pod \"certified-operators-ksnlw\" (UID: \"5e8255c9-03f1-4bc6-a043-66d5f2078b2c\") " pod="openshift-marketplace/certified-operators-ksnlw" Oct 07 17:59:53 crc kubenswrapper[4681]: I1007 17:59:53.807051 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ksnlw" Oct 07 17:59:55 crc kubenswrapper[4681]: I1007 17:59:55.069764 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ksnlw"] Oct 07 17:59:55 crc kubenswrapper[4681]: I1007 17:59:55.998561 4681 generic.go:334] "Generic (PLEG): container finished" podID="5e8255c9-03f1-4bc6-a043-66d5f2078b2c" containerID="2be189d0512b41342b7dbf22093f9f867af613bf2165c1799a1e0e5b0b21bccd" exitCode=0 Oct 07 17:59:55 crc kubenswrapper[4681]: I1007 17:59:55.998663 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksnlw" event={"ID":"5e8255c9-03f1-4bc6-a043-66d5f2078b2c","Type":"ContainerDied","Data":"2be189d0512b41342b7dbf22093f9f867af613bf2165c1799a1e0e5b0b21bccd"} Oct 07 17:59:55 crc kubenswrapper[4681]: I1007 17:59:55.998830 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksnlw" event={"ID":"5e8255c9-03f1-4bc6-a043-66d5f2078b2c","Type":"ContainerStarted","Data":"ce6b9846eb898955c909997fe32c0edced9f78558e11ed6ba62f74edb70e06f1"} Oct 07 17:59:57 crc kubenswrapper[4681]: I1007 17:59:57.671880 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-shcr2"] Oct 07 17:59:57 crc kubenswrapper[4681]: I1007 17:59:57.677593 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-shcr2" Oct 07 17:59:57 crc kubenswrapper[4681]: I1007 17:59:57.686801 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-shcr2"] Oct 07 17:59:57 crc kubenswrapper[4681]: I1007 17:59:57.772660 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56svs\" (UniqueName: \"kubernetes.io/projected/deccc418-4ef7-4682-8ba7-194e7767d05f-kube-api-access-56svs\") pod \"community-operators-shcr2\" (UID: \"deccc418-4ef7-4682-8ba7-194e7767d05f\") " pod="openshift-marketplace/community-operators-shcr2" Oct 07 17:59:57 crc kubenswrapper[4681]: I1007 17:59:57.772705 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deccc418-4ef7-4682-8ba7-194e7767d05f-utilities\") pod \"community-operators-shcr2\" (UID: \"deccc418-4ef7-4682-8ba7-194e7767d05f\") " pod="openshift-marketplace/community-operators-shcr2" Oct 07 17:59:57 crc kubenswrapper[4681]: I1007 17:59:57.772789 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deccc418-4ef7-4682-8ba7-194e7767d05f-catalog-content\") pod \"community-operators-shcr2\" (UID: \"deccc418-4ef7-4682-8ba7-194e7767d05f\") " pod="openshift-marketplace/community-operators-shcr2" Oct 07 17:59:57 crc kubenswrapper[4681]: I1007 17:59:57.873954 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deccc418-4ef7-4682-8ba7-194e7767d05f-catalog-content\") pod \"community-operators-shcr2\" (UID: \"deccc418-4ef7-4682-8ba7-194e7767d05f\") " pod="openshift-marketplace/community-operators-shcr2" Oct 07 17:59:57 crc kubenswrapper[4681]: I1007 17:59:57.874104 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56svs\" (UniqueName: \"kubernetes.io/projected/deccc418-4ef7-4682-8ba7-194e7767d05f-kube-api-access-56svs\") pod \"community-operators-shcr2\" (UID: \"deccc418-4ef7-4682-8ba7-194e7767d05f\") " pod="openshift-marketplace/community-operators-shcr2" Oct 07 17:59:57 crc kubenswrapper[4681]: I1007 17:59:57.874126 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deccc418-4ef7-4682-8ba7-194e7767d05f-utilities\") pod \"community-operators-shcr2\" (UID: \"deccc418-4ef7-4682-8ba7-194e7767d05f\") " pod="openshift-marketplace/community-operators-shcr2" Oct 07 17:59:57 crc kubenswrapper[4681]: I1007 17:59:57.874474 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deccc418-4ef7-4682-8ba7-194e7767d05f-catalog-content\") pod \"community-operators-shcr2\" (UID: \"deccc418-4ef7-4682-8ba7-194e7767d05f\") " pod="openshift-marketplace/community-operators-shcr2" Oct 07 17:59:57 crc kubenswrapper[4681]: I1007 17:59:57.874823 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deccc418-4ef7-4682-8ba7-194e7767d05f-utilities\") pod \"community-operators-shcr2\" (UID: \"deccc418-4ef7-4682-8ba7-194e7767d05f\") " pod="openshift-marketplace/community-operators-shcr2" Oct 07 17:59:57 crc kubenswrapper[4681]: I1007 17:59:57.917665 4681 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-56svs\" (UniqueName: \"kubernetes.io/projected/deccc418-4ef7-4682-8ba7-194e7767d05f-kube-api-access-56svs\") pod \"community-operators-shcr2\" (UID: \"deccc418-4ef7-4682-8ba7-194e7767d05f\") " pod="openshift-marketplace/community-operators-shcr2" Oct 07 17:59:58 crc kubenswrapper[4681]: I1007 17:59:58.001986 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-shcr2" Oct 07 17:59:58 crc kubenswrapper[4681]: I1007 17:59:58.017954 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksnlw" event={"ID":"5e8255c9-03f1-4bc6-a043-66d5f2078b2c","Type":"ContainerStarted","Data":"4d7547e76558633c7aa11285d68349e39d75a658d09764d36c7c97a2a56d74ef"} Oct 07 17:59:58 crc kubenswrapper[4681]: I1007 17:59:58.569996 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-shcr2"] Oct 07 17:59:59 crc kubenswrapper[4681]: I1007 17:59:59.027163 4681 generic.go:334] "Generic (PLEG): container finished" podID="5e8255c9-03f1-4bc6-a043-66d5f2078b2c" containerID="4d7547e76558633c7aa11285d68349e39d75a658d09764d36c7c97a2a56d74ef" exitCode=0 Oct 07 17:59:59 crc kubenswrapper[4681]: I1007 17:59:59.027211 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksnlw" event={"ID":"5e8255c9-03f1-4bc6-a043-66d5f2078b2c","Type":"ContainerDied","Data":"4d7547e76558633c7aa11285d68349e39d75a658d09764d36c7c97a2a56d74ef"} Oct 07 17:59:59 crc kubenswrapper[4681]: I1007 17:59:59.029068 4681 generic.go:334] "Generic (PLEG): container finished" podID="deccc418-4ef7-4682-8ba7-194e7767d05f" containerID="a8d35224f7cf25f962cf654551392cc9d5a1f60b0e514a1d5e8792bae638f66d" exitCode=0 Oct 07 17:59:59 crc kubenswrapper[4681]: I1007 17:59:59.048135 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shcr2" event={"ID":"deccc418-4ef7-4682-8ba7-194e7767d05f","Type":"ContainerDied","Data":"a8d35224f7cf25f962cf654551392cc9d5a1f60b0e514a1d5e8792bae638f66d"} Oct 07 17:59:59 crc kubenswrapper[4681]: I1007 17:59:59.048173 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shcr2" event={"ID":"deccc418-4ef7-4682-8ba7-194e7767d05f","Type":"ContainerStarted","Data":"46ac7fc18b49e45c38c1f7a37a5b87ed465c9743aaeaddea15165f784dacad8a"} Oct 07 18:00:00 crc kubenswrapper[4681]: I1007 18:00:00.040199 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shcr2" event={"ID":"deccc418-4ef7-4682-8ba7-194e7767d05f","Type":"ContainerStarted","Data":"9d5b7bec6848048df91560f580494eb0b8fd0abcd3d911e682e3caf23ef4c175"} Oct 07 18:00:00 crc kubenswrapper[4681]: I1007 18:00:00.043387 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksnlw" event={"ID":"5e8255c9-03f1-4bc6-a043-66d5f2078b2c","Type":"ContainerStarted","Data":"87147610e37be3aa18b4cd12eea8d1336ce8b571a29ef40d1cf4977bd32549e4"} Oct 07 18:00:00 crc kubenswrapper[4681]: I1007 18:00:00.080481 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ksnlw" podStartSLOduration=3.40272326 podStartE2EDuration="7.080460066s" podCreationTimestamp="2025-10-07 17:59:53 +0000 UTC" firstStartedPulling="2025-10-07 17:59:56.001062345 +0000 UTC m=+3399.648473900" lastFinishedPulling="2025-10-07 
17:59:59.678799141 +0000 UTC m=+3403.326210706" observedRunningTime="2025-10-07 18:00:00.076338962 +0000 UTC m=+3403.723750507" watchObservedRunningTime="2025-10-07 18:00:00.080460066 +0000 UTC m=+3403.727871621" Oct 07 18:00:00 crc kubenswrapper[4681]: I1007 18:00:00.157560 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331000-d827x"] Oct 07 18:00:00 crc kubenswrapper[4681]: I1007 18:00:00.158993 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331000-d827x" Oct 07 18:00:00 crc kubenswrapper[4681]: I1007 18:00:00.161130 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 18:00:00 crc kubenswrapper[4681]: I1007 18:00:00.161491 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 18:00:00 crc kubenswrapper[4681]: I1007 18:00:00.177303 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331000-d827x"] Oct 07 18:00:00 crc kubenswrapper[4681]: I1007 18:00:00.323462 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef540e02-bee9-45df-9ad3-b5a8476f9e73-config-volume\") pod \"collect-profiles-29331000-d827x\" (UID: \"ef540e02-bee9-45df-9ad3-b5a8476f9e73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331000-d827x" Oct 07 18:00:00 crc kubenswrapper[4681]: I1007 18:00:00.323531 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxz5d\" (UniqueName: \"kubernetes.io/projected/ef540e02-bee9-45df-9ad3-b5a8476f9e73-kube-api-access-sxz5d\") pod \"collect-profiles-29331000-d827x\" (UID: \"ef540e02-bee9-45df-9ad3-b5a8476f9e73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331000-d827x" Oct 07 18:00:00 crc kubenswrapper[4681]: I1007 18:00:00.323613 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef540e02-bee9-45df-9ad3-b5a8476f9e73-secret-volume\") pod \"collect-profiles-29331000-d827x\" (UID: \"ef540e02-bee9-45df-9ad3-b5a8476f9e73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331000-d827x" Oct 07 18:00:00 crc kubenswrapper[4681]: I1007 18:00:00.425063 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef540e02-bee9-45df-9ad3-b5a8476f9e73-config-volume\") pod \"collect-profiles-29331000-d827x\" (UID: \"ef540e02-bee9-45df-9ad3-b5a8476f9e73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331000-d827x" Oct 07 18:00:00 crc kubenswrapper[4681]: I1007 18:00:00.425590 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxz5d\" (UniqueName: \"kubernetes.io/projected/ef540e02-bee9-45df-9ad3-b5a8476f9e73-kube-api-access-sxz5d\") pod \"collect-profiles-29331000-d827x\" (UID: \"ef540e02-bee9-45df-9ad3-b5a8476f9e73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331000-d827x" Oct 07 18:00:00 crc kubenswrapper[4681]: I1007 18:00:00.425765 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef540e02-bee9-45df-9ad3-b5a8476f9e73-secret-volume\") pod \"collect-profiles-29331000-d827x\" (UID: \"ef540e02-bee9-45df-9ad3-b5a8476f9e73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331000-d827x" Oct 07 18:00:00 crc kubenswrapper[4681]: I1007 18:00:00.425953 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef540e02-bee9-45df-9ad3-b5a8476f9e73-config-volume\") pod \"collect-profiles-29331000-d827x\" (UID: \"ef540e02-bee9-45df-9ad3-b5a8476f9e73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331000-d827x" Oct 07 18:00:00 crc kubenswrapper[4681]: I1007 18:00:00.442234 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef540e02-bee9-45df-9ad3-b5a8476f9e73-secret-volume\") pod \"collect-profiles-29331000-d827x\" (UID: \"ef540e02-bee9-45df-9ad3-b5a8476f9e73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331000-d827x" Oct 07 18:00:00 crc kubenswrapper[4681]: I1007 18:00:00.443018 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxz5d\" (UniqueName: \"kubernetes.io/projected/ef540e02-bee9-45df-9ad3-b5a8476f9e73-kube-api-access-sxz5d\") pod \"collect-profiles-29331000-d827x\" (UID: \"ef540e02-bee9-45df-9ad3-b5a8476f9e73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331000-d827x" Oct 07 18:00:00 crc kubenswrapper[4681]: I1007 18:00:00.477480 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331000-d827x" Oct 07 18:00:01 crc kubenswrapper[4681]: I1007 18:00:01.007214 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331000-d827x"] Oct 07 18:00:01 crc kubenswrapper[4681]: I1007 18:00:01.053523 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331000-d827x" event={"ID":"ef540e02-bee9-45df-9ad3-b5a8476f9e73","Type":"ContainerStarted","Data":"12e6235efcee17983986eccd2617f58b42c68ff6064ea9992443a05339bfa066"} Oct 07 18:00:02 crc kubenswrapper[4681]: I1007 18:00:02.063291 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331000-d827x" event={"ID":"ef540e02-bee9-45df-9ad3-b5a8476f9e73","Type":"ContainerStarted","Data":"d46399bc2ee56d4e9594dbfc4cdeb6c2485854927bfcc3e3c3f255af9732eb10"} Oct 07 18:00:03 crc kubenswrapper[4681]: I1007 18:00:03.075822 4681 generic.go:334] "Generic (PLEG): container finished" podID="deccc418-4ef7-4682-8ba7-194e7767d05f" containerID="9d5b7bec6848048df91560f580494eb0b8fd0abcd3d911e682e3caf23ef4c175" exitCode=0 Oct 07 18:00:03 crc kubenswrapper[4681]: I1007 18:00:03.075914 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shcr2" event={"ID":"deccc418-4ef7-4682-8ba7-194e7767d05f","Type":"ContainerDied","Data":"9d5b7bec6848048df91560f580494eb0b8fd0abcd3d911e682e3caf23ef4c175"} Oct 07 18:00:03 crc kubenswrapper[4681]: I1007 18:00:03.078351 4681 generic.go:334] "Generic (PLEG): container finished" podID="ef540e02-bee9-45df-9ad3-b5a8476f9e73" containerID="d46399bc2ee56d4e9594dbfc4cdeb6c2485854927bfcc3e3c3f255af9732eb10" exitCode=0 Oct 07 18:00:03 crc kubenswrapper[4681]: I1007 18:00:03.078387 4681 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331000-d827x" event={"ID":"ef540e02-bee9-45df-9ad3-b5a8476f9e73","Type":"ContainerDied","Data":"d46399bc2ee56d4e9594dbfc4cdeb6c2485854927bfcc3e3c3f255af9732eb10"} Oct 07 18:00:03 crc kubenswrapper[4681]: I1007 18:00:03.106569 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29331000-d827x" podStartSLOduration=3.106544798 podStartE2EDuration="3.106544798s" podCreationTimestamp="2025-10-07 18:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 18:00:02.092270852 +0000 UTC m=+3405.739682407" watchObservedRunningTime="2025-10-07 18:00:03.106544798 +0000 UTC m=+3406.753956363" Oct 07 18:00:03 crc kubenswrapper[4681]: I1007 18:00:03.807990 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ksnlw" Oct 07 18:00:03 crc kubenswrapper[4681]: I1007 18:00:03.808295 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ksnlw" Oct 07 18:00:04 crc kubenswrapper[4681]: I1007 18:00:04.544328 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331000-d827x" Oct 07 18:00:04 crc kubenswrapper[4681]: I1007 18:00:04.629671 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef540e02-bee9-45df-9ad3-b5a8476f9e73-secret-volume\") pod \"ef540e02-bee9-45df-9ad3-b5a8476f9e73\" (UID: \"ef540e02-bee9-45df-9ad3-b5a8476f9e73\") " Oct 07 18:00:04 crc kubenswrapper[4681]: I1007 18:00:04.631198 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxz5d\" (UniqueName: \"kubernetes.io/projected/ef540e02-bee9-45df-9ad3-b5a8476f9e73-kube-api-access-sxz5d\") pod \"ef540e02-bee9-45df-9ad3-b5a8476f9e73\" (UID: \"ef540e02-bee9-45df-9ad3-b5a8476f9e73\") " Oct 07 18:00:04 crc kubenswrapper[4681]: I1007 18:00:04.631502 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef540e02-bee9-45df-9ad3-b5a8476f9e73-config-volume\") pod \"ef540e02-bee9-45df-9ad3-b5a8476f9e73\" (UID: \"ef540e02-bee9-45df-9ad3-b5a8476f9e73\") " Oct 07 18:00:04 crc kubenswrapper[4681]: I1007 18:00:04.633863 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef540e02-bee9-45df-9ad3-b5a8476f9e73-config-volume" (OuterVolumeSpecName: "config-volume") pod "ef540e02-bee9-45df-9ad3-b5a8476f9e73" (UID: "ef540e02-bee9-45df-9ad3-b5a8476f9e73"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 18:00:04 crc kubenswrapper[4681]: I1007 18:00:04.637751 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef540e02-bee9-45df-9ad3-b5a8476f9e73-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ef540e02-bee9-45df-9ad3-b5a8476f9e73" (UID: "ef540e02-bee9-45df-9ad3-b5a8476f9e73"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 18:00:04 crc kubenswrapper[4681]: I1007 18:00:04.647006 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef540e02-bee9-45df-9ad3-b5a8476f9e73-kube-api-access-sxz5d" (OuterVolumeSpecName: "kube-api-access-sxz5d") pod "ef540e02-bee9-45df-9ad3-b5a8476f9e73" (UID: "ef540e02-bee9-45df-9ad3-b5a8476f9e73"). InnerVolumeSpecName "kube-api-access-sxz5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 18:00:04 crc kubenswrapper[4681]: I1007 18:00:04.733336 4681 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef540e02-bee9-45df-9ad3-b5a8476f9e73-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 18:00:04 crc kubenswrapper[4681]: I1007 18:00:04.733365 4681 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef540e02-bee9-45df-9ad3-b5a8476f9e73-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 18:00:04 crc kubenswrapper[4681]: I1007 18:00:04.733375 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxz5d\" (UniqueName: \"kubernetes.io/projected/ef540e02-bee9-45df-9ad3-b5a8476f9e73-kube-api-access-sxz5d\") on node \"crc\" DevicePath \"\"" Oct 07 18:00:04 crc kubenswrapper[4681]: I1007 18:00:04.859677 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-ksnlw" podUID="5e8255c9-03f1-4bc6-a043-66d5f2078b2c" containerName="registry-server" probeResult="failure" output=< Oct 07 18:00:04 crc kubenswrapper[4681]: timeout: failed to connect service ":50051" within 1s Oct 07 18:00:04 crc kubenswrapper[4681]: > Oct 07 18:00:05 crc kubenswrapper[4681]: I1007 18:00:05.097429 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331000-d827x" event={"ID":"ef540e02-bee9-45df-9ad3-b5a8476f9e73","Type":"ContainerDied","Data":"12e6235efcee17983986eccd2617f58b42c68ff6064ea9992443a05339bfa066"} Oct 07 18:00:05 crc kubenswrapper[4681]: I1007 18:00:05.097755 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12e6235efcee17983986eccd2617f58b42c68ff6064ea9992443a05339bfa066" Oct 07 18:00:05 crc kubenswrapper[4681]: I1007 18:00:05.097819 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331000-d827x" Oct 07 18:00:05 crc kubenswrapper[4681]: I1007 18:00:05.101761 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shcr2" event={"ID":"deccc418-4ef7-4682-8ba7-194e7767d05f","Type":"ContainerStarted","Data":"9516d83cb542a369e1e0aff4ccf2ed88bc21f7a06f9e4e1b7d36dd846c43711f"} Oct 07 18:00:05 crc kubenswrapper[4681]: I1007 18:00:05.144403 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-shcr2" podStartSLOduration=3.021601741 podStartE2EDuration="8.144385776s" podCreationTimestamp="2025-10-07 17:59:57 +0000 UTC" firstStartedPulling="2025-10-07 17:59:59.030740736 +0000 UTC m=+3402.678152291" lastFinishedPulling="2025-10-07 18:00:04.153524771 +0000 UTC m=+3407.800936326" observedRunningTime="2025-10-07 18:00:05.141780574 +0000 UTC m=+3408.789192139" watchObservedRunningTime="2025-10-07 18:00:05.144385776 +0000 UTC m=+3408.791797331" Oct 07 18:00:05 crc kubenswrapper[4681]: I1007 18:00:05.175358 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd"] Oct 07 18:00:05 crc kubenswrapper[4681]: I1007 18:00:05.182546 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330955-rkdgd"] Oct 07 18:00:07 crc kubenswrapper[4681]: I1007 18:00:07.042600 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9aa3d7c-f712-4749-a1d8-a9688c1c3d23" path="/var/lib/kubelet/pods/d9aa3d7c-f712-4749-a1d8-a9688c1c3d23/volumes" Oct 07 18:00:08 crc kubenswrapper[4681]: I1007 18:00:08.002206 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-shcr2" Oct 07 18:00:08 crc kubenswrapper[4681]: I1007 18:00:08.002266 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-shcr2" Oct 07 18:00:09 crc kubenswrapper[4681]: I1007 18:00:09.053301 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-shcr2" podUID="deccc418-4ef7-4682-8ba7-194e7767d05f" containerName="registry-server" probeResult="failure" output=< Oct 07 18:00:09 crc kubenswrapper[4681]: timeout: failed to connect service ":50051" within 1s Oct 07 18:00:09 crc kubenswrapper[4681]: > Oct 07 18:00:12 crc kubenswrapper[4681]: I1007 18:00:12.196280 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 18:00:12 crc kubenswrapper[4681]: I1007 18:00:12.196569 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 18:00:12 crc kubenswrapper[4681]: I1007 18:00:12.196625 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" Oct 07 18:00:12 crc kubenswrapper[4681]: I1007 18:00:12.197528 4681 kuberuntime_manager.go:1027] 
"Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"26299cac2d314ca6d5ea24a56aef772ad6bf58bfb220778407adf569e4845a4f"} pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 18:00:12 crc kubenswrapper[4681]: I1007 18:00:12.197610 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" containerID="cri-o://26299cac2d314ca6d5ea24a56aef772ad6bf58bfb220778407adf569e4845a4f" gracePeriod=600 Oct 07 18:00:13 crc kubenswrapper[4681]: I1007 18:00:13.181897 4681 generic.go:334] "Generic (PLEG): container finished" podID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerID="26299cac2d314ca6d5ea24a56aef772ad6bf58bfb220778407adf569e4845a4f" exitCode=0 Oct 07 18:00:13 crc kubenswrapper[4681]: I1007 18:00:13.181946 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerDied","Data":"26299cac2d314ca6d5ea24a56aef772ad6bf58bfb220778407adf569e4845a4f"} Oct 07 18:00:13 crc kubenswrapper[4681]: I1007 18:00:13.182243 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerStarted","Data":"abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d"} Oct 07 18:00:13 crc kubenswrapper[4681]: I1007 18:00:13.182269 4681 scope.go:117] "RemoveContainer" containerID="6c7c236ed560c4b8a1b1ff4b4d13f7839110732ee87660e08f6e47191132ef9b" Oct 07 18:00:13 crc kubenswrapper[4681]: I1007 18:00:13.860488 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ksnlw" Oct 07 18:00:13 crc kubenswrapper[4681]: I1007 18:00:13.924993 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ksnlw" Oct 07 18:00:14 crc kubenswrapper[4681]: I1007 18:00:14.094483 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ksnlw"] Oct 07 18:00:15 crc kubenswrapper[4681]: I1007 18:00:15.215895 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ksnlw" podUID="5e8255c9-03f1-4bc6-a043-66d5f2078b2c" containerName="registry-server" containerID="cri-o://87147610e37be3aa18b4cd12eea8d1336ce8b571a29ef40d1cf4977bd32549e4" gracePeriod=2 Oct 07 18:00:16 crc kubenswrapper[4681]: I1007 18:00:16.011874 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ksnlw" Oct 07 18:00:16 crc kubenswrapper[4681]: I1007 18:00:16.139803 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddhvf\" (UniqueName: \"kubernetes.io/projected/5e8255c9-03f1-4bc6-a043-66d5f2078b2c-kube-api-access-ddhvf\") pod \"5e8255c9-03f1-4bc6-a043-66d5f2078b2c\" (UID: \"5e8255c9-03f1-4bc6-a043-66d5f2078b2c\") " Oct 07 18:00:16 crc kubenswrapper[4681]: I1007 18:00:16.139929 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e8255c9-03f1-4bc6-a043-66d5f2078b2c-catalog-content\") pod \"5e8255c9-03f1-4bc6-a043-66d5f2078b2c\" (UID: \"5e8255c9-03f1-4bc6-a043-66d5f2078b2c\") " Oct 07 18:00:16 crc kubenswrapper[4681]: I1007 18:00:16.140132 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e8255c9-03f1-4bc6-a043-66d5f2078b2c-utilities\") pod \"5e8255c9-03f1-4bc6-a043-66d5f2078b2c\" (UID: \"5e8255c9-03f1-4bc6-a043-66d5f2078b2c\") " Oct 07 18:00:16 crc kubenswrapper[4681]: I1007 18:00:16.140750 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e8255c9-03f1-4bc6-a043-66d5f2078b2c-utilities" (OuterVolumeSpecName: "utilities") pod "5e8255c9-03f1-4bc6-a043-66d5f2078b2c" (UID: "5e8255c9-03f1-4bc6-a043-66d5f2078b2c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 18:00:16 crc kubenswrapper[4681]: I1007 18:00:16.152044 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e8255c9-03f1-4bc6-a043-66d5f2078b2c-kube-api-access-ddhvf" (OuterVolumeSpecName: "kube-api-access-ddhvf") pod "5e8255c9-03f1-4bc6-a043-66d5f2078b2c" (UID: "5e8255c9-03f1-4bc6-a043-66d5f2078b2c"). InnerVolumeSpecName "kube-api-access-ddhvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 18:00:16 crc kubenswrapper[4681]: I1007 18:00:16.173580 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e8255c9-03f1-4bc6-a043-66d5f2078b2c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e8255c9-03f1-4bc6-a043-66d5f2078b2c" (UID: "5e8255c9-03f1-4bc6-a043-66d5f2078b2c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 18:00:16 crc kubenswrapper[4681]: I1007 18:00:16.224556 4681 generic.go:334] "Generic (PLEG): container finished" podID="5e8255c9-03f1-4bc6-a043-66d5f2078b2c" containerID="87147610e37be3aa18b4cd12eea8d1336ce8b571a29ef40d1cf4977bd32549e4" exitCode=0 Oct 07 18:00:16 crc kubenswrapper[4681]: I1007 18:00:16.224598 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksnlw" event={"ID":"5e8255c9-03f1-4bc6-a043-66d5f2078b2c","Type":"ContainerDied","Data":"87147610e37be3aa18b4cd12eea8d1336ce8b571a29ef40d1cf4977bd32549e4"} Oct 07 18:00:16 crc kubenswrapper[4681]: I1007 18:00:16.224624 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksnlw" event={"ID":"5e8255c9-03f1-4bc6-a043-66d5f2078b2c","Type":"ContainerDied","Data":"ce6b9846eb898955c909997fe32c0edced9f78558e11ed6ba62f74edb70e06f1"} Oct 07 18:00:16 crc kubenswrapper[4681]: I1007 18:00:16.224640 4681 scope.go:117] "RemoveContainer" containerID="87147610e37be3aa18b4cd12eea8d1336ce8b571a29ef40d1cf4977bd32549e4" Oct 07 18:00:16 crc kubenswrapper[4681]: I1007 18:00:16.224753 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ksnlw" Oct 07 18:00:16 crc kubenswrapper[4681]: I1007 18:00:16.242194 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e8255c9-03f1-4bc6-a043-66d5f2078b2c-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 18:00:16 crc kubenswrapper[4681]: I1007 18:00:16.242395 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddhvf\" (UniqueName: \"kubernetes.io/projected/5e8255c9-03f1-4bc6-a043-66d5f2078b2c-kube-api-access-ddhvf\") on node \"crc\" DevicePath \"\"" Oct 07 18:00:16 crc kubenswrapper[4681]: I1007 18:00:16.242467 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e8255c9-03f1-4bc6-a043-66d5f2078b2c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 18:00:16 crc kubenswrapper[4681]: I1007 18:00:16.247427 4681 scope.go:117] "RemoveContainer" containerID="4d7547e76558633c7aa11285d68349e39d75a658d09764d36c7c97a2a56d74ef" Oct 07 18:00:16 crc kubenswrapper[4681]: I1007 18:00:16.259958 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ksnlw"] Oct 07 18:00:16 crc kubenswrapper[4681]: I1007 18:00:16.268291 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ksnlw"] Oct 07 18:00:16 crc kubenswrapper[4681]: I1007 18:00:16.280579 4681 scope.go:117] "RemoveContainer" containerID="2be189d0512b41342b7dbf22093f9f867af613bf2165c1799a1e0e5b0b21bccd" Oct 07 18:00:16 crc kubenswrapper[4681]: I1007 18:00:16.309625 4681 scope.go:117] "RemoveContainer" containerID="87147610e37be3aa18b4cd12eea8d1336ce8b571a29ef40d1cf4977bd32549e4" Oct 07 18:00:16 crc kubenswrapper[4681]: E1007 18:00:16.310068 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87147610e37be3aa18b4cd12eea8d1336ce8b571a29ef40d1cf4977bd32549e4\": container with ID starting with 87147610e37be3aa18b4cd12eea8d1336ce8b571a29ef40d1cf4977bd32549e4 not found: ID does not exist" containerID="87147610e37be3aa18b4cd12eea8d1336ce8b571a29ef40d1cf4977bd32549e4" Oct 07 18:00:16 crc kubenswrapper[4681]: I1007 18:00:16.310100 
4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87147610e37be3aa18b4cd12eea8d1336ce8b571a29ef40d1cf4977bd32549e4"} err="failed to get container status \"87147610e37be3aa18b4cd12eea8d1336ce8b571a29ef40d1cf4977bd32549e4\": rpc error: code = NotFound desc = could not find container \"87147610e37be3aa18b4cd12eea8d1336ce8b571a29ef40d1cf4977bd32549e4\": container with ID starting with 87147610e37be3aa18b4cd12eea8d1336ce8b571a29ef40d1cf4977bd32549e4 not found: ID does not exist" Oct 07 18:00:16 crc kubenswrapper[4681]: I1007 18:00:16.310121 4681 scope.go:117] "RemoveContainer" containerID="4d7547e76558633c7aa11285d68349e39d75a658d09764d36c7c97a2a56d74ef" Oct 07 18:00:16 crc kubenswrapper[4681]: E1007 18:00:16.310444 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d7547e76558633c7aa11285d68349e39d75a658d09764d36c7c97a2a56d74ef\": container with ID starting with 4d7547e76558633c7aa11285d68349e39d75a658d09764d36c7c97a2a56d74ef not found: ID does not exist" containerID="4d7547e76558633c7aa11285d68349e39d75a658d09764d36c7c97a2a56d74ef" Oct 07 18:00:16 crc kubenswrapper[4681]: I1007 18:00:16.310467 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d7547e76558633c7aa11285d68349e39d75a658d09764d36c7c97a2a56d74ef"} err="failed to get container status \"4d7547e76558633c7aa11285d68349e39d75a658d09764d36c7c97a2a56d74ef\": rpc error: code = NotFound desc = could not find container \"4d7547e76558633c7aa11285d68349e39d75a658d09764d36c7c97a2a56d74ef\": container with ID starting with 4d7547e76558633c7aa11285d68349e39d75a658d09764d36c7c97a2a56d74ef not found: ID does not exist" Oct 07 18:00:16 crc kubenswrapper[4681]: I1007 18:00:16.310480 4681 scope.go:117] "RemoveContainer" containerID="2be189d0512b41342b7dbf22093f9f867af613bf2165c1799a1e0e5b0b21bccd" Oct 07 18:00:16 crc kubenswrapper[4681]: E1007 18:00:16.310659 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2be189d0512b41342b7dbf22093f9f867af613bf2165c1799a1e0e5b0b21bccd\": container with ID starting with 2be189d0512b41342b7dbf22093f9f867af613bf2165c1799a1e0e5b0b21bccd not found: ID does not exist" containerID="2be189d0512b41342b7dbf22093f9f867af613bf2165c1799a1e0e5b0b21bccd" Oct 07 18:00:16 crc kubenswrapper[4681]: I1007 18:00:16.310681 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2be189d0512b41342b7dbf22093f9f867af613bf2165c1799a1e0e5b0b21bccd"} err="failed to get container status \"2be189d0512b41342b7dbf22093f9f867af613bf2165c1799a1e0e5b0b21bccd\": rpc error: code = NotFound desc = could not find container \"2be189d0512b41342b7dbf22093f9f867af613bf2165c1799a1e0e5b0b21bccd\": container with ID starting with 2be189d0512b41342b7dbf22093f9f867af613bf2165c1799a1e0e5b0b21bccd not found: ID does not exist" Oct 07 18:00:17 crc kubenswrapper[4681]: I1007 18:00:17.044388 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e8255c9-03f1-4bc6-a043-66d5f2078b2c" path="/var/lib/kubelet/pods/5e8255c9-03f1-4bc6-a043-66d5f2078b2c/volumes" Oct 07 18:00:18 crc kubenswrapper[4681]: I1007 18:00:18.044967 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-shcr2" Oct 07 18:00:18 crc kubenswrapper[4681]: I1007 18:00:18.098684 4681 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/community-operators-shcr2" Oct 07 18:00:18 crc kubenswrapper[4681]: I1007 18:00:18.493360 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-shcr2"] Oct 07 18:00:19 crc kubenswrapper[4681]: I1007 18:00:19.251322 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-shcr2" podUID="deccc418-4ef7-4682-8ba7-194e7767d05f" containerName="registry-server" containerID="cri-o://9516d83cb542a369e1e0aff4ccf2ed88bc21f7a06f9e4e1b7d36dd846c43711f" gracePeriod=2 Oct 07 18:00:19 crc kubenswrapper[4681]: I1007 18:00:19.916953 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-shcr2" Oct 07 18:00:20 crc kubenswrapper[4681]: I1007 18:00:20.015745 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56svs\" (UniqueName: \"kubernetes.io/projected/deccc418-4ef7-4682-8ba7-194e7767d05f-kube-api-access-56svs\") pod \"deccc418-4ef7-4682-8ba7-194e7767d05f\" (UID: \"deccc418-4ef7-4682-8ba7-194e7767d05f\") " Oct 07 18:00:20 crc kubenswrapper[4681]: I1007 18:00:20.015984 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deccc418-4ef7-4682-8ba7-194e7767d05f-utilities\") pod \"deccc418-4ef7-4682-8ba7-194e7767d05f\" (UID: \"deccc418-4ef7-4682-8ba7-194e7767d05f\") " Oct 07 18:00:20 crc kubenswrapper[4681]: I1007 18:00:20.016032 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deccc418-4ef7-4682-8ba7-194e7767d05f-catalog-content\") pod \"deccc418-4ef7-4682-8ba7-194e7767d05f\" (UID: \"deccc418-4ef7-4682-8ba7-194e7767d05f\") " Oct 07 18:00:20 crc kubenswrapper[4681]: I1007 18:00:20.016644 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deccc418-4ef7-4682-8ba7-194e7767d05f-utilities" (OuterVolumeSpecName: "utilities") pod "deccc418-4ef7-4682-8ba7-194e7767d05f" (UID: "deccc418-4ef7-4682-8ba7-194e7767d05f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 18:00:20 crc kubenswrapper[4681]: I1007 18:00:20.023095 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deccc418-4ef7-4682-8ba7-194e7767d05f-kube-api-access-56svs" (OuterVolumeSpecName: "kube-api-access-56svs") pod "deccc418-4ef7-4682-8ba7-194e7767d05f" (UID: "deccc418-4ef7-4682-8ba7-194e7767d05f"). InnerVolumeSpecName "kube-api-access-56svs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 18:00:20 crc kubenswrapper[4681]: I1007 18:00:20.079075 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deccc418-4ef7-4682-8ba7-194e7767d05f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "deccc418-4ef7-4682-8ba7-194e7767d05f" (UID: "deccc418-4ef7-4682-8ba7-194e7767d05f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 18:00:20 crc kubenswrapper[4681]: I1007 18:00:20.119915 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deccc418-4ef7-4682-8ba7-194e7767d05f-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 18:00:20 crc kubenswrapper[4681]: I1007 18:00:20.119944 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deccc418-4ef7-4682-8ba7-194e7767d05f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 18:00:20 crc kubenswrapper[4681]: I1007 18:00:20.119961 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56svs\" (UniqueName: \"kubernetes.io/projected/deccc418-4ef7-4682-8ba7-194e7767d05f-kube-api-access-56svs\") on node \"crc\" DevicePath \"\"" Oct 07 18:00:20 crc kubenswrapper[4681]: I1007 18:00:20.263545 4681 generic.go:334] "Generic (PLEG): container finished" podID="deccc418-4ef7-4682-8ba7-194e7767d05f" containerID="9516d83cb542a369e1e0aff4ccf2ed88bc21f7a06f9e4e1b7d36dd846c43711f" exitCode=0 Oct 07 18:00:20 crc kubenswrapper[4681]: I1007 18:00:20.263616 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shcr2" event={"ID":"deccc418-4ef7-4682-8ba7-194e7767d05f","Type":"ContainerDied","Data":"9516d83cb542a369e1e0aff4ccf2ed88bc21f7a06f9e4e1b7d36dd846c43711f"} Oct 07 18:00:20 crc kubenswrapper[4681]: I1007 18:00:20.263649 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-shcr2" event={"ID":"deccc418-4ef7-4682-8ba7-194e7767d05f","Type":"ContainerDied","Data":"46ac7fc18b49e45c38c1f7a37a5b87ed465c9743aaeaddea15165f784dacad8a"} Oct 07 18:00:20 crc kubenswrapper[4681]: I1007 18:00:20.263671 4681 scope.go:117] "RemoveContainer" containerID="9516d83cb542a369e1e0aff4ccf2ed88bc21f7a06f9e4e1b7d36dd846c43711f" Oct 07 18:00:20 crc kubenswrapper[4681]: I1007 18:00:20.263619 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-shcr2" Oct 07 18:00:20 crc kubenswrapper[4681]: I1007 18:00:20.293760 4681 scope.go:117] "RemoveContainer" containerID="9d5b7bec6848048df91560f580494eb0b8fd0abcd3d911e682e3caf23ef4c175" Oct 07 18:00:20 crc kubenswrapper[4681]: I1007 18:00:20.318263 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-shcr2"] Oct 07 18:00:20 crc kubenswrapper[4681]: I1007 18:00:20.329262 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-shcr2"] Oct 07 18:00:20 crc kubenswrapper[4681]: I1007 18:00:20.353419 4681 scope.go:117] "RemoveContainer" containerID="a8d35224f7cf25f962cf654551392cc9d5a1f60b0e514a1d5e8792bae638f66d" Oct 07 18:00:20 crc kubenswrapper[4681]: I1007 18:00:20.375280 4681 scope.go:117] "RemoveContainer" containerID="9516d83cb542a369e1e0aff4ccf2ed88bc21f7a06f9e4e1b7d36dd846c43711f" Oct 07 18:00:20 crc kubenswrapper[4681]: E1007 18:00:20.376204 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9516d83cb542a369e1e0aff4ccf2ed88bc21f7a06f9e4e1b7d36dd846c43711f\": container with ID starting with 9516d83cb542a369e1e0aff4ccf2ed88bc21f7a06f9e4e1b7d36dd846c43711f not found: ID does not exist" containerID="9516d83cb542a369e1e0aff4ccf2ed88bc21f7a06f9e4e1b7d36dd846c43711f" Oct 07 18:00:20 crc kubenswrapper[4681]: I1007 18:00:20.376243 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9516d83cb542a369e1e0aff4ccf2ed88bc21f7a06f9e4e1b7d36dd846c43711f"} err="failed to get container status \"9516d83cb542a369e1e0aff4ccf2ed88bc21f7a06f9e4e1b7d36dd846c43711f\": rpc error: code = NotFound desc = could not find container \"9516d83cb542a369e1e0aff4ccf2ed88bc21f7a06f9e4e1b7d36dd846c43711f\": container with ID starting with 9516d83cb542a369e1e0aff4ccf2ed88bc21f7a06f9e4e1b7d36dd846c43711f not found: ID does not exist" Oct 07 18:00:20 crc kubenswrapper[4681]: I1007 18:00:20.376270 4681 scope.go:117] "RemoveContainer" containerID="9d5b7bec6848048df91560f580494eb0b8fd0abcd3d911e682e3caf23ef4c175" Oct 07 18:00:20 crc kubenswrapper[4681]: E1007 18:00:20.376660 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d5b7bec6848048df91560f580494eb0b8fd0abcd3d911e682e3caf23ef4c175\": container with ID starting with 9d5b7bec6848048df91560f580494eb0b8fd0abcd3d911e682e3caf23ef4c175 not found: ID does not exist" containerID="9d5b7bec6848048df91560f580494eb0b8fd0abcd3d911e682e3caf23ef4c175" Oct 07 18:00:20 crc kubenswrapper[4681]: I1007 18:00:20.376697 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d5b7bec6848048df91560f580494eb0b8fd0abcd3d911e682e3caf23ef4c175"} err="failed to get container status \"9d5b7bec6848048df91560f580494eb0b8fd0abcd3d911e682e3caf23ef4c175\": rpc error: code = NotFound desc = could not find container \"9d5b7bec6848048df91560f580494eb0b8fd0abcd3d911e682e3caf23ef4c175\": container with ID starting with 9d5b7bec6848048df91560f580494eb0b8fd0abcd3d911e682e3caf23ef4c175 not found: ID does not exist" Oct 07 18:00:20 crc kubenswrapper[4681]: I1007 18:00:20.376714 4681 scope.go:117] "RemoveContainer" containerID="a8d35224f7cf25f962cf654551392cc9d5a1f60b0e514a1d5e8792bae638f66d" Oct 07 18:00:20 crc kubenswrapper[4681]: E1007 18:00:20.376984 4681 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a8d35224f7cf25f962cf654551392cc9d5a1f60b0e514a1d5e8792bae638f66d\": container with ID starting with a8d35224f7cf25f962cf654551392cc9d5a1f60b0e514a1d5e8792bae638f66d not found: ID does not exist" containerID="a8d35224f7cf25f962cf654551392cc9d5a1f60b0e514a1d5e8792bae638f66d" Oct 07 18:00:20 crc kubenswrapper[4681]: I1007 18:00:20.377026 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8d35224f7cf25f962cf654551392cc9d5a1f60b0e514a1d5e8792bae638f66d"} err="failed to get container status \"a8d35224f7cf25f962cf654551392cc9d5a1f60b0e514a1d5e8792bae638f66d\": rpc error: code = NotFound desc = could not find container \"a8d35224f7cf25f962cf654551392cc9d5a1f60b0e514a1d5e8792bae638f66d\": container with ID starting with a8d35224f7cf25f962cf654551392cc9d5a1f60b0e514a1d5e8792bae638f66d not found: ID does not exist" Oct 07 18:00:21 crc kubenswrapper[4681]: I1007 18:00:21.042173 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deccc418-4ef7-4682-8ba7-194e7767d05f" path="/var/lib/kubelet/pods/deccc418-4ef7-4682-8ba7-194e7767d05f/volumes" Oct 07 18:00:36 crc kubenswrapper[4681]: I1007 18:00:36.714740 4681 scope.go:117] "RemoveContainer" containerID="f32f915c969fabc27d2e0413dd11d6d252033af1742c668dd39a3b0a31d59730" Oct 07 18:01:00 crc kubenswrapper[4681]: I1007 18:01:00.177722 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29331001-dj57c"] Oct 07 18:01:00 crc kubenswrapper[4681]: E1007 18:01:00.178574 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef540e02-bee9-45df-9ad3-b5a8476f9e73" containerName="collect-profiles" Oct 07 18:01:00 crc kubenswrapper[4681]: I1007 18:01:00.178588 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef540e02-bee9-45df-9ad3-b5a8476f9e73" containerName="collect-profiles" Oct 07 18:01:00 crc kubenswrapper[4681]: E1007 18:01:00.178610 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deccc418-4ef7-4682-8ba7-194e7767d05f" containerName="extract-utilities" Oct 07 18:01:00 crc kubenswrapper[4681]: I1007 18:01:00.178617 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="deccc418-4ef7-4682-8ba7-194e7767d05f" containerName="extract-utilities" Oct 07 18:01:00 crc kubenswrapper[4681]: E1007 18:01:00.178632 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e8255c9-03f1-4bc6-a043-66d5f2078b2c" containerName="registry-server" Oct 07 18:01:00 crc kubenswrapper[4681]: I1007 18:01:00.178638 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e8255c9-03f1-4bc6-a043-66d5f2078b2c" containerName="registry-server" Oct 07 18:01:00 crc kubenswrapper[4681]: E1007 18:01:00.178651 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deccc418-4ef7-4682-8ba7-194e7767d05f" containerName="registry-server" Oct 07 18:01:00 crc kubenswrapper[4681]: I1007 18:01:00.178657 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="deccc418-4ef7-4682-8ba7-194e7767d05f" containerName="registry-server" Oct 07 18:01:00 crc kubenswrapper[4681]: E1007 18:01:00.178670 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deccc418-4ef7-4682-8ba7-194e7767d05f" containerName="extract-content" Oct 07 18:01:00 crc kubenswrapper[4681]: I1007 18:01:00.178675 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="deccc418-4ef7-4682-8ba7-194e7767d05f" containerName="extract-content" Oct 07 18:01:00 crc 
kubenswrapper[4681]: E1007 18:01:00.178687 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e8255c9-03f1-4bc6-a043-66d5f2078b2c" containerName="extract-utilities" Oct 07 18:01:00 crc kubenswrapper[4681]: I1007 18:01:00.178693 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e8255c9-03f1-4bc6-a043-66d5f2078b2c" containerName="extract-utilities" Oct 07 18:01:00 crc kubenswrapper[4681]: E1007 18:01:00.178706 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e8255c9-03f1-4bc6-a043-66d5f2078b2c" containerName="extract-content" Oct 07 18:01:00 crc kubenswrapper[4681]: I1007 18:01:00.178711 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e8255c9-03f1-4bc6-a043-66d5f2078b2c" containerName="extract-content" Oct 07 18:01:00 crc kubenswrapper[4681]: I1007 18:01:00.178917 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e8255c9-03f1-4bc6-a043-66d5f2078b2c" containerName="registry-server" Oct 07 18:01:00 crc kubenswrapper[4681]: I1007 18:01:00.178934 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="deccc418-4ef7-4682-8ba7-194e7767d05f" containerName="registry-server" Oct 07 18:01:00 crc kubenswrapper[4681]: I1007 18:01:00.178942 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef540e02-bee9-45df-9ad3-b5a8476f9e73" containerName="collect-profiles" Oct 07 18:01:00 crc kubenswrapper[4681]: I1007 18:01:00.179493 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29331001-dj57c" Oct 07 18:01:00 crc kubenswrapper[4681]: I1007 18:01:00.196422 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29331001-dj57c"] Oct 07 18:01:00 crc kubenswrapper[4681]: I1007 18:01:00.337278 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0205aec3-2b1b-427b-9359-40d4118c7f59-combined-ca-bundle\") pod \"keystone-cron-29331001-dj57c\" (UID: \"0205aec3-2b1b-427b-9359-40d4118c7f59\") " pod="openstack/keystone-cron-29331001-dj57c" Oct 07 18:01:00 crc kubenswrapper[4681]: I1007 18:01:00.337370 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwdnh\" (UniqueName: \"kubernetes.io/projected/0205aec3-2b1b-427b-9359-40d4118c7f59-kube-api-access-gwdnh\") pod \"keystone-cron-29331001-dj57c\" (UID: \"0205aec3-2b1b-427b-9359-40d4118c7f59\") " pod="openstack/keystone-cron-29331001-dj57c" Oct 07 18:01:00 crc kubenswrapper[4681]: I1007 18:01:00.337399 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0205aec3-2b1b-427b-9359-40d4118c7f59-fernet-keys\") pod \"keystone-cron-29331001-dj57c\" (UID: \"0205aec3-2b1b-427b-9359-40d4118c7f59\") " pod="openstack/keystone-cron-29331001-dj57c" Oct 07 18:01:00 crc kubenswrapper[4681]: I1007 18:01:00.337456 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0205aec3-2b1b-427b-9359-40d4118c7f59-config-data\") pod \"keystone-cron-29331001-dj57c\" (UID: \"0205aec3-2b1b-427b-9359-40d4118c7f59\") " pod="openstack/keystone-cron-29331001-dj57c" Oct 07 18:01:00 crc kubenswrapper[4681]: I1007 18:01:00.439143 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwdnh\" (UniqueName: 
\"kubernetes.io/projected/0205aec3-2b1b-427b-9359-40d4118c7f59-kube-api-access-gwdnh\") pod \"keystone-cron-29331001-dj57c\" (UID: \"0205aec3-2b1b-427b-9359-40d4118c7f59\") " pod="openstack/keystone-cron-29331001-dj57c" Oct 07 18:01:00 crc kubenswrapper[4681]: I1007 18:01:00.439205 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0205aec3-2b1b-427b-9359-40d4118c7f59-fernet-keys\") pod \"keystone-cron-29331001-dj57c\" (UID: \"0205aec3-2b1b-427b-9359-40d4118c7f59\") " pod="openstack/keystone-cron-29331001-dj57c" Oct 07 18:01:00 crc kubenswrapper[4681]: I1007 18:01:00.439274 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0205aec3-2b1b-427b-9359-40d4118c7f59-config-data\") pod \"keystone-cron-29331001-dj57c\" (UID: \"0205aec3-2b1b-427b-9359-40d4118c7f59\") " pod="openstack/keystone-cron-29331001-dj57c" Oct 07 18:01:00 crc kubenswrapper[4681]: I1007 18:01:00.439339 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0205aec3-2b1b-427b-9359-40d4118c7f59-combined-ca-bundle\") pod \"keystone-cron-29331001-dj57c\" (UID: \"0205aec3-2b1b-427b-9359-40d4118c7f59\") " pod="openstack/keystone-cron-29331001-dj57c" Oct 07 18:01:00 crc kubenswrapper[4681]: I1007 18:01:00.446934 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0205aec3-2b1b-427b-9359-40d4118c7f59-combined-ca-bundle\") pod \"keystone-cron-29331001-dj57c\" (UID: \"0205aec3-2b1b-427b-9359-40d4118c7f59\") " pod="openstack/keystone-cron-29331001-dj57c" Oct 07 18:01:00 crc kubenswrapper[4681]: I1007 18:01:00.447160 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0205aec3-2b1b-427b-9359-40d4118c7f59-config-data\") pod \"keystone-cron-29331001-dj57c\" (UID: \"0205aec3-2b1b-427b-9359-40d4118c7f59\") " pod="openstack/keystone-cron-29331001-dj57c" Oct 07 18:01:00 crc kubenswrapper[4681]: I1007 18:01:00.449536 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0205aec3-2b1b-427b-9359-40d4118c7f59-fernet-keys\") pod \"keystone-cron-29331001-dj57c\" (UID: \"0205aec3-2b1b-427b-9359-40d4118c7f59\") " pod="openstack/keystone-cron-29331001-dj57c" Oct 07 18:01:00 crc kubenswrapper[4681]: I1007 18:01:00.472738 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwdnh\" (UniqueName: \"kubernetes.io/projected/0205aec3-2b1b-427b-9359-40d4118c7f59-kube-api-access-gwdnh\") pod \"keystone-cron-29331001-dj57c\" (UID: \"0205aec3-2b1b-427b-9359-40d4118c7f59\") " pod="openstack/keystone-cron-29331001-dj57c" Oct 07 18:01:00 crc kubenswrapper[4681]: I1007 18:01:00.496577 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29331001-dj57c" Oct 07 18:01:01 crc kubenswrapper[4681]: I1007 18:01:01.005885 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29331001-dj57c"] Oct 07 18:01:01 crc kubenswrapper[4681]: I1007 18:01:01.621830 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29331001-dj57c" event={"ID":"0205aec3-2b1b-427b-9359-40d4118c7f59","Type":"ContainerStarted","Data":"065d35c0f4bfde001ec0d2af06e20cea2daf48a17982d84c0c56dac152847d70"} Oct 07 18:01:01 crc kubenswrapper[4681]: I1007 18:01:01.622164 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29331001-dj57c" event={"ID":"0205aec3-2b1b-427b-9359-40d4118c7f59","Type":"ContainerStarted","Data":"3dbf1cb1e1567095fe827b3f92c1316c16a6848c8faf48037c4345747c5fc9ab"} Oct 07 18:01:01 crc kubenswrapper[4681]: I1007 18:01:01.646689 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29331001-dj57c" podStartSLOduration=1.646672124 podStartE2EDuration="1.646672124s" podCreationTimestamp="2025-10-07 18:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 18:01:01.637430827 +0000 UTC m=+3465.284842392" watchObservedRunningTime="2025-10-07 18:01:01.646672124 +0000 UTC m=+3465.294083669" Oct 07 18:01:05 crc kubenswrapper[4681]: I1007 18:01:05.655931 4681 generic.go:334] "Generic (PLEG): container finished" podID="0205aec3-2b1b-427b-9359-40d4118c7f59" containerID="065d35c0f4bfde001ec0d2af06e20cea2daf48a17982d84c0c56dac152847d70" exitCode=0 Oct 07 18:01:05 crc kubenswrapper[4681]: I1007 18:01:05.656022 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29331001-dj57c" event={"ID":"0205aec3-2b1b-427b-9359-40d4118c7f59","Type":"ContainerDied","Data":"065d35c0f4bfde001ec0d2af06e20cea2daf48a17982d84c0c56dac152847d70"} Oct 07 18:01:07 crc kubenswrapper[4681]: I1007 18:01:07.321570 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29331001-dj57c" Oct 07 18:01:07 crc kubenswrapper[4681]: I1007 18:01:07.472038 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0205aec3-2b1b-427b-9359-40d4118c7f59-combined-ca-bundle\") pod \"0205aec3-2b1b-427b-9359-40d4118c7f59\" (UID: \"0205aec3-2b1b-427b-9359-40d4118c7f59\") " Oct 07 18:01:07 crc kubenswrapper[4681]: I1007 18:01:07.472168 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0205aec3-2b1b-427b-9359-40d4118c7f59-fernet-keys\") pod \"0205aec3-2b1b-427b-9359-40d4118c7f59\" (UID: \"0205aec3-2b1b-427b-9359-40d4118c7f59\") " Oct 07 18:01:07 crc kubenswrapper[4681]: I1007 18:01:07.472410 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwdnh\" (UniqueName: \"kubernetes.io/projected/0205aec3-2b1b-427b-9359-40d4118c7f59-kube-api-access-gwdnh\") pod \"0205aec3-2b1b-427b-9359-40d4118c7f59\" (UID: \"0205aec3-2b1b-427b-9359-40d4118c7f59\") " Oct 07 18:01:07 crc kubenswrapper[4681]: I1007 18:01:07.472436 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0205aec3-2b1b-427b-9359-40d4118c7f59-config-data\") pod \"0205aec3-2b1b-427b-9359-40d4118c7f59\" (UID: \"0205aec3-2b1b-427b-9359-40d4118c7f59\") " Oct 07 18:01:07 crc kubenswrapper[4681]: I1007 18:01:07.479733 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0205aec3-2b1b-427b-9359-40d4118c7f59-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0205aec3-2b1b-427b-9359-40d4118c7f59" (UID: "0205aec3-2b1b-427b-9359-40d4118c7f59"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 18:01:07 crc kubenswrapper[4681]: I1007 18:01:07.481031 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0205aec3-2b1b-427b-9359-40d4118c7f59-kube-api-access-gwdnh" (OuterVolumeSpecName: "kube-api-access-gwdnh") pod "0205aec3-2b1b-427b-9359-40d4118c7f59" (UID: "0205aec3-2b1b-427b-9359-40d4118c7f59"). InnerVolumeSpecName "kube-api-access-gwdnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 18:01:07 crc kubenswrapper[4681]: I1007 18:01:07.509041 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0205aec3-2b1b-427b-9359-40d4118c7f59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0205aec3-2b1b-427b-9359-40d4118c7f59" (UID: "0205aec3-2b1b-427b-9359-40d4118c7f59"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 18:01:07 crc kubenswrapper[4681]: I1007 18:01:07.560944 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0205aec3-2b1b-427b-9359-40d4118c7f59-config-data" (OuterVolumeSpecName: "config-data") pod "0205aec3-2b1b-427b-9359-40d4118c7f59" (UID: "0205aec3-2b1b-427b-9359-40d4118c7f59"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 18:01:07 crc kubenswrapper[4681]: I1007 18:01:07.574330 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwdnh\" (UniqueName: \"kubernetes.io/projected/0205aec3-2b1b-427b-9359-40d4118c7f59-kube-api-access-gwdnh\") on node \"crc\" DevicePath \"\"" Oct 07 18:01:07 crc kubenswrapper[4681]: I1007 18:01:07.574362 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0205aec3-2b1b-427b-9359-40d4118c7f59-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 18:01:07 crc kubenswrapper[4681]: I1007 18:01:07.574371 4681 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0205aec3-2b1b-427b-9359-40d4118c7f59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 18:01:07 crc kubenswrapper[4681]: I1007 18:01:07.574381 4681 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0205aec3-2b1b-427b-9359-40d4118c7f59-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 18:01:07 crc kubenswrapper[4681]: I1007 18:01:07.674231 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29331001-dj57c" event={"ID":"0205aec3-2b1b-427b-9359-40d4118c7f59","Type":"ContainerDied","Data":"3dbf1cb1e1567095fe827b3f92c1316c16a6848c8faf48037c4345747c5fc9ab"} Oct 07 18:01:07 crc kubenswrapper[4681]: I1007 18:01:07.674271 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3dbf1cb1e1567095fe827b3f92c1316c16a6848c8faf48037c4345747c5fc9ab" Oct 07 18:01:07 crc kubenswrapper[4681]: I1007 18:01:07.674335 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29331001-dj57c" Oct 07 18:02:12 crc kubenswrapper[4681]: I1007 18:02:12.195336 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 18:02:12 crc kubenswrapper[4681]: I1007 18:02:12.195987 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 18:02:42 crc kubenswrapper[4681]: I1007 18:02:42.195771 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 18:02:42 crc kubenswrapper[4681]: I1007 18:02:42.196298 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 18:03:12 crc kubenswrapper[4681]: I1007 18:03:12.195524 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 18:03:12 crc kubenswrapper[4681]: I1007 18:03:12.196470 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 18:03:12 crc kubenswrapper[4681]: I1007 18:03:12.196603 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" Oct 07 18:03:12 crc kubenswrapper[4681]: I1007 18:03:12.197751 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d"} pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 18:03:12 crc kubenswrapper[4681]: I1007 18:03:12.197820 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" containerID="cri-o://abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" gracePeriod=600 Oct 07 18:03:12 crc kubenswrapper[4681]: E1007 18:03:12.330232 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:03:12 crc kubenswrapper[4681]: I1007 18:03:12.908335 4681 generic.go:334] "Generic (PLEG): container finished" podID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" exitCode=0 Oct 07 18:03:12 crc kubenswrapper[4681]: I1007 18:03:12.908409 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerDied","Data":"abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d"} Oct 07 18:03:12 crc kubenswrapper[4681]: I1007 18:03:12.908473 4681 scope.go:117] "RemoveContainer" containerID="26299cac2d314ca6d5ea24a56aef772ad6bf58bfb220778407adf569e4845a4f" Oct 07 18:03:12 crc kubenswrapper[4681]: I1007 18:03:12.909219 4681 scope.go:117] "RemoveContainer" containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" Oct 07 18:03:12 crc kubenswrapper[4681]: E1007 18:03:12.913433 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:03:26 crc kubenswrapper[4681]: I1007 
18:03:26.029859 4681 scope.go:117] "RemoveContainer" containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" Oct 07 18:03:26 crc kubenswrapper[4681]: E1007 18:03:26.030803 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:03:39 crc kubenswrapper[4681]: I1007 18:03:39.029632 4681 scope.go:117] "RemoveContainer" containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" Oct 07 18:03:39 crc kubenswrapper[4681]: E1007 18:03:39.030354 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:03:49 crc kubenswrapper[4681]: I1007 18:03:49.099934 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vblgc"] Oct 07 18:03:49 crc kubenswrapper[4681]: E1007 18:03:49.100680 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0205aec3-2b1b-427b-9359-40d4118c7f59" containerName="keystone-cron" Oct 07 18:03:49 crc kubenswrapper[4681]: I1007 18:03:49.100693 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="0205aec3-2b1b-427b-9359-40d4118c7f59" containerName="keystone-cron" Oct 07 18:03:49 crc kubenswrapper[4681]: I1007 18:03:49.106179 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="0205aec3-2b1b-427b-9359-40d4118c7f59" containerName="keystone-cron" Oct 07 18:03:49 crc kubenswrapper[4681]: I1007 18:03:49.111320 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vblgc"] Oct 07 18:03:49 crc kubenswrapper[4681]: I1007 18:03:49.114015 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vblgc" Oct 07 18:03:49 crc kubenswrapper[4681]: I1007 18:03:49.155867 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v7dn\" (UniqueName: \"kubernetes.io/projected/93fb4eab-c635-4cb3-9377-3ff6957ebf83-kube-api-access-2v7dn\") pod \"redhat-marketplace-vblgc\" (UID: \"93fb4eab-c635-4cb3-9377-3ff6957ebf83\") " pod="openshift-marketplace/redhat-marketplace-vblgc" Oct 07 18:03:49 crc kubenswrapper[4681]: I1007 18:03:49.156215 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93fb4eab-c635-4cb3-9377-3ff6957ebf83-utilities\") pod \"redhat-marketplace-vblgc\" (UID: \"93fb4eab-c635-4cb3-9377-3ff6957ebf83\") " pod="openshift-marketplace/redhat-marketplace-vblgc" Oct 07 18:03:49 crc kubenswrapper[4681]: I1007 18:03:49.156339 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93fb4eab-c635-4cb3-9377-3ff6957ebf83-catalog-content\") pod \"redhat-marketplace-vblgc\" (UID: \"93fb4eab-c635-4cb3-9377-3ff6957ebf83\") " pod="openshift-marketplace/redhat-marketplace-vblgc" Oct 07 18:03:49 crc kubenswrapper[4681]: I1007 18:03:49.257966 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93fb4eab-c635-4cb3-9377-3ff6957ebf83-catalog-content\") pod \"redhat-marketplace-vblgc\" (UID: \"93fb4eab-c635-4cb3-9377-3ff6957ebf83\") " pod="openshift-marketplace/redhat-marketplace-vblgc" Oct 07 18:03:49 crc kubenswrapper[4681]: I1007 18:03:49.258078 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v7dn\" (UniqueName: \"kubernetes.io/projected/93fb4eab-c635-4cb3-9377-3ff6957ebf83-kube-api-access-2v7dn\") pod \"redhat-marketplace-vblgc\" (UID: \"93fb4eab-c635-4cb3-9377-3ff6957ebf83\") " pod="openshift-marketplace/redhat-marketplace-vblgc" Oct 07 18:03:49 crc kubenswrapper[4681]: I1007 18:03:49.258141 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93fb4eab-c635-4cb3-9377-3ff6957ebf83-utilities\") pod \"redhat-marketplace-vblgc\" (UID: \"93fb4eab-c635-4cb3-9377-3ff6957ebf83\") " pod="openshift-marketplace/redhat-marketplace-vblgc" Oct 07 18:03:49 crc kubenswrapper[4681]: I1007 18:03:49.258503 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93fb4eab-c635-4cb3-9377-3ff6957ebf83-catalog-content\") pod \"redhat-marketplace-vblgc\" (UID: \"93fb4eab-c635-4cb3-9377-3ff6957ebf83\") " pod="openshift-marketplace/redhat-marketplace-vblgc" Oct 07 18:03:49 crc kubenswrapper[4681]: I1007 18:03:49.258536 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93fb4eab-c635-4cb3-9377-3ff6957ebf83-utilities\") pod \"redhat-marketplace-vblgc\" (UID: \"93fb4eab-c635-4cb3-9377-3ff6957ebf83\") " pod="openshift-marketplace/redhat-marketplace-vblgc" Oct 07 18:03:49 crc kubenswrapper[4681]: I1007 18:03:49.292191 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v7dn\" (UniqueName: \"kubernetes.io/projected/93fb4eab-c635-4cb3-9377-3ff6957ebf83-kube-api-access-2v7dn\") pod 
\"redhat-marketplace-vblgc\" (UID: \"93fb4eab-c635-4cb3-9377-3ff6957ebf83\") " pod="openshift-marketplace/redhat-marketplace-vblgc" Oct 07 18:03:49 crc kubenswrapper[4681]: I1007 18:03:49.443154 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vblgc" Oct 07 18:03:50 crc kubenswrapper[4681]: I1007 18:03:50.029529 4681 scope.go:117] "RemoveContainer" containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" Oct 07 18:03:50 crc kubenswrapper[4681]: E1007 18:03:50.030043 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:03:50 crc kubenswrapper[4681]: I1007 18:03:50.187139 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vblgc"] Oct 07 18:03:50 crc kubenswrapper[4681]: I1007 18:03:50.225691 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vblgc" event={"ID":"93fb4eab-c635-4cb3-9377-3ff6957ebf83","Type":"ContainerStarted","Data":"dc0ac07aa2ab8de549aca9c3c9d8f117cde8a8a07d253df96fa11005f3956a0d"} Oct 07 18:03:51 crc kubenswrapper[4681]: I1007 18:03:51.235327 4681 generic.go:334] "Generic (PLEG): container finished" podID="93fb4eab-c635-4cb3-9377-3ff6957ebf83" containerID="caf3c1e9623dda18ac6ad76bb0f1dceef137b3ccb24272d1dea8d99805f76297" exitCode=0 Oct 07 18:03:51 crc kubenswrapper[4681]: I1007 18:03:51.235379 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vblgc" event={"ID":"93fb4eab-c635-4cb3-9377-3ff6957ebf83","Type":"ContainerDied","Data":"caf3c1e9623dda18ac6ad76bb0f1dceef137b3ccb24272d1dea8d99805f76297"} Oct 07 18:03:51 crc kubenswrapper[4681]: I1007 18:03:51.239819 4681 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 18:03:53 crc kubenswrapper[4681]: I1007 18:03:53.256109 4681 generic.go:334] "Generic (PLEG): container finished" podID="93fb4eab-c635-4cb3-9377-3ff6957ebf83" containerID="b04bcb6af55dd5354c8eb7f72343a0c77c669517b236f2e822b5cf8edbd34c7a" exitCode=0 Oct 07 18:03:53 crc kubenswrapper[4681]: I1007 18:03:53.256202 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vblgc" event={"ID":"93fb4eab-c635-4cb3-9377-3ff6957ebf83","Type":"ContainerDied","Data":"b04bcb6af55dd5354c8eb7f72343a0c77c669517b236f2e822b5cf8edbd34c7a"} Oct 07 18:03:54 crc kubenswrapper[4681]: I1007 18:03:54.267456 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vblgc" event={"ID":"93fb4eab-c635-4cb3-9377-3ff6957ebf83","Type":"ContainerStarted","Data":"91f23138999f69a4276d6103b81092772872e845dab4e069789151ed262003fb"} Oct 07 18:03:54 crc kubenswrapper[4681]: I1007 18:03:54.292420 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vblgc" podStartSLOduration=2.800001666 podStartE2EDuration="5.292401571s" podCreationTimestamp="2025-10-07 18:03:49 +0000 UTC" firstStartedPulling="2025-10-07 18:03:51.239083073 +0000 UTC m=+3634.886494628" lastFinishedPulling="2025-10-07 
18:03:53.731482978 +0000 UTC m=+3637.378894533" observedRunningTime="2025-10-07 18:03:54.283608777 +0000 UTC m=+3637.931020332" watchObservedRunningTime="2025-10-07 18:03:54.292401571 +0000 UTC m=+3637.939813126" Oct 07 18:03:59 crc kubenswrapper[4681]: I1007 18:03:59.443370 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vblgc" Oct 07 18:03:59 crc kubenswrapper[4681]: I1007 18:03:59.443980 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vblgc" Oct 07 18:03:59 crc kubenswrapper[4681]: I1007 18:03:59.504651 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vblgc" Oct 07 18:04:00 crc kubenswrapper[4681]: I1007 18:04:00.371987 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vblgc" Oct 07 18:04:00 crc kubenswrapper[4681]: I1007 18:04:00.421522 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vblgc"] Oct 07 18:04:02 crc kubenswrapper[4681]: I1007 18:04:02.341318 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vblgc" podUID="93fb4eab-c635-4cb3-9377-3ff6957ebf83" containerName="registry-server" containerID="cri-o://91f23138999f69a4276d6103b81092772872e845dab4e069789151ed262003fb" gracePeriod=2 Oct 07 18:04:03 crc kubenswrapper[4681]: I1007 18:04:03.108753 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vblgc" Oct 07 18:04:03 crc kubenswrapper[4681]: I1007 18:04:03.140330 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93fb4eab-c635-4cb3-9377-3ff6957ebf83-utilities\") pod \"93fb4eab-c635-4cb3-9377-3ff6957ebf83\" (UID: \"93fb4eab-c635-4cb3-9377-3ff6957ebf83\") " Oct 07 18:04:03 crc kubenswrapper[4681]: I1007 18:04:03.140522 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93fb4eab-c635-4cb3-9377-3ff6957ebf83-catalog-content\") pod \"93fb4eab-c635-4cb3-9377-3ff6957ebf83\" (UID: \"93fb4eab-c635-4cb3-9377-3ff6957ebf83\") " Oct 07 18:04:03 crc kubenswrapper[4681]: I1007 18:04:03.140728 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v7dn\" (UniqueName: \"kubernetes.io/projected/93fb4eab-c635-4cb3-9377-3ff6957ebf83-kube-api-access-2v7dn\") pod \"93fb4eab-c635-4cb3-9377-3ff6957ebf83\" (UID: \"93fb4eab-c635-4cb3-9377-3ff6957ebf83\") " Oct 07 18:04:03 crc kubenswrapper[4681]: I1007 18:04:03.175182 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93fb4eab-c635-4cb3-9377-3ff6957ebf83-utilities" (OuterVolumeSpecName: "utilities") pod "93fb4eab-c635-4cb3-9377-3ff6957ebf83" (UID: "93fb4eab-c635-4cb3-9377-3ff6957ebf83"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 18:04:03 crc kubenswrapper[4681]: I1007 18:04:03.175938 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93fb4eab-c635-4cb3-9377-3ff6957ebf83-kube-api-access-2v7dn" (OuterVolumeSpecName: "kube-api-access-2v7dn") pod "93fb4eab-c635-4cb3-9377-3ff6957ebf83" (UID: "93fb4eab-c635-4cb3-9377-3ff6957ebf83"). InnerVolumeSpecName "kube-api-access-2v7dn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 18:04:03 crc kubenswrapper[4681]: I1007 18:04:03.191853 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93fb4eab-c635-4cb3-9377-3ff6957ebf83-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93fb4eab-c635-4cb3-9377-3ff6957ebf83" (UID: "93fb4eab-c635-4cb3-9377-3ff6957ebf83"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 18:04:03 crc kubenswrapper[4681]: I1007 18:04:03.245595 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v7dn\" (UniqueName: \"kubernetes.io/projected/93fb4eab-c635-4cb3-9377-3ff6957ebf83-kube-api-access-2v7dn\") on node \"crc\" DevicePath \"\"" Oct 07 18:04:03 crc kubenswrapper[4681]: I1007 18:04:03.245629 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93fb4eab-c635-4cb3-9377-3ff6957ebf83-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 18:04:03 crc kubenswrapper[4681]: I1007 18:04:03.245638 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93fb4eab-c635-4cb3-9377-3ff6957ebf83-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 18:04:03 crc kubenswrapper[4681]: I1007 18:04:03.351399 4681 generic.go:334] "Generic (PLEG): container finished" podID="93fb4eab-c635-4cb3-9377-3ff6957ebf83" containerID="91f23138999f69a4276d6103b81092772872e845dab4e069789151ed262003fb" exitCode=0 Oct 07 18:04:03 crc kubenswrapper[4681]: I1007 18:04:03.351446 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vblgc" event={"ID":"93fb4eab-c635-4cb3-9377-3ff6957ebf83","Type":"ContainerDied","Data":"91f23138999f69a4276d6103b81092772872e845dab4e069789151ed262003fb"} Oct 07 18:04:03 crc kubenswrapper[4681]: I1007 18:04:03.351474 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vblgc" event={"ID":"93fb4eab-c635-4cb3-9377-3ff6957ebf83","Type":"ContainerDied","Data":"dc0ac07aa2ab8de549aca9c3c9d8f117cde8a8a07d253df96fa11005f3956a0d"} Oct 07 18:04:03 crc kubenswrapper[4681]: I1007 18:04:03.351492 4681 scope.go:117] "RemoveContainer" containerID="91f23138999f69a4276d6103b81092772872e845dab4e069789151ed262003fb" Oct 07 18:04:03 crc kubenswrapper[4681]: I1007 18:04:03.351506 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vblgc" Oct 07 18:04:03 crc kubenswrapper[4681]: I1007 18:04:03.394968 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vblgc"] Oct 07 18:04:03 crc kubenswrapper[4681]: I1007 18:04:03.397754 4681 scope.go:117] "RemoveContainer" containerID="b04bcb6af55dd5354c8eb7f72343a0c77c669517b236f2e822b5cf8edbd34c7a" Oct 07 18:04:03 crc kubenswrapper[4681]: I1007 18:04:03.418092 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vblgc"] Oct 07 18:04:03 crc kubenswrapper[4681]: I1007 18:04:03.423462 4681 scope.go:117] "RemoveContainer" containerID="caf3c1e9623dda18ac6ad76bb0f1dceef137b3ccb24272d1dea8d99805f76297" Oct 07 18:04:03 crc kubenswrapper[4681]: I1007 18:04:03.474856 4681 scope.go:117] "RemoveContainer" containerID="91f23138999f69a4276d6103b81092772872e845dab4e069789151ed262003fb" Oct 07 18:04:03 crc kubenswrapper[4681]: E1007 18:04:03.475363 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91f23138999f69a4276d6103b81092772872e845dab4e069789151ed262003fb\": container with ID starting with 91f23138999f69a4276d6103b81092772872e845dab4e069789151ed262003fb not found: ID does not exist" containerID="91f23138999f69a4276d6103b81092772872e845dab4e069789151ed262003fb" Oct 07 18:04:03 crc kubenswrapper[4681]: I1007 18:04:03.475420 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91f23138999f69a4276d6103b81092772872e845dab4e069789151ed262003fb"} err="failed to get container status \"91f23138999f69a4276d6103b81092772872e845dab4e069789151ed262003fb\": rpc error: code = NotFound desc = could not find container \"91f23138999f69a4276d6103b81092772872e845dab4e069789151ed262003fb\": container with ID starting with 91f23138999f69a4276d6103b81092772872e845dab4e069789151ed262003fb not found: ID does not exist" Oct 07 18:04:03 crc kubenswrapper[4681]: I1007 18:04:03.475445 4681 scope.go:117] "RemoveContainer" containerID="b04bcb6af55dd5354c8eb7f72343a0c77c669517b236f2e822b5cf8edbd34c7a" Oct 07 18:04:03 crc kubenswrapper[4681]: E1007 18:04:03.475928 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b04bcb6af55dd5354c8eb7f72343a0c77c669517b236f2e822b5cf8edbd34c7a\": container with ID starting with b04bcb6af55dd5354c8eb7f72343a0c77c669517b236f2e822b5cf8edbd34c7a not found: ID does not exist" containerID="b04bcb6af55dd5354c8eb7f72343a0c77c669517b236f2e822b5cf8edbd34c7a" Oct 07 18:04:03 crc kubenswrapper[4681]: I1007 18:04:03.475976 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b04bcb6af55dd5354c8eb7f72343a0c77c669517b236f2e822b5cf8edbd34c7a"} err="failed to get container status \"b04bcb6af55dd5354c8eb7f72343a0c77c669517b236f2e822b5cf8edbd34c7a\": rpc error: code = NotFound desc = could not find container \"b04bcb6af55dd5354c8eb7f72343a0c77c669517b236f2e822b5cf8edbd34c7a\": container with ID starting with b04bcb6af55dd5354c8eb7f72343a0c77c669517b236f2e822b5cf8edbd34c7a not found: ID does not exist" Oct 07 18:04:03 crc kubenswrapper[4681]: I1007 18:04:03.475994 4681 scope.go:117] "RemoveContainer" containerID="caf3c1e9623dda18ac6ad76bb0f1dceef137b3ccb24272d1dea8d99805f76297" Oct 07 18:04:03 crc kubenswrapper[4681]: E1007 18:04:03.476368 4681 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"caf3c1e9623dda18ac6ad76bb0f1dceef137b3ccb24272d1dea8d99805f76297\": container with ID starting with caf3c1e9623dda18ac6ad76bb0f1dceef137b3ccb24272d1dea8d99805f76297 not found: ID does not exist" containerID="caf3c1e9623dda18ac6ad76bb0f1dceef137b3ccb24272d1dea8d99805f76297" Oct 07 18:04:03 crc kubenswrapper[4681]: I1007 18:04:03.476465 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caf3c1e9623dda18ac6ad76bb0f1dceef137b3ccb24272d1dea8d99805f76297"} err="failed to get container status \"caf3c1e9623dda18ac6ad76bb0f1dceef137b3ccb24272d1dea8d99805f76297\": rpc error: code = NotFound desc = could not find container \"caf3c1e9623dda18ac6ad76bb0f1dceef137b3ccb24272d1dea8d99805f76297\": container with ID starting with caf3c1e9623dda18ac6ad76bb0f1dceef137b3ccb24272d1dea8d99805f76297 not found: ID does not exist" Oct 07 18:04:05 crc kubenswrapper[4681]: I1007 18:04:05.036024 4681 scope.go:117] "RemoveContainer" containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" Oct 07 18:04:05 crc kubenswrapper[4681]: E1007 18:04:05.036573 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:04:05 crc kubenswrapper[4681]: I1007 18:04:05.043858 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93fb4eab-c635-4cb3-9377-3ff6957ebf83" path="/var/lib/kubelet/pods/93fb4eab-c635-4cb3-9377-3ff6957ebf83/volumes" Oct 07 18:04:19 crc kubenswrapper[4681]: I1007 18:04:19.034644 4681 scope.go:117] "RemoveContainer" containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" Oct 07 18:04:19 crc kubenswrapper[4681]: E1007 18:04:19.035339 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:04:30 crc kubenswrapper[4681]: I1007 18:04:30.029671 4681 scope.go:117] "RemoveContainer" containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" Oct 07 18:04:30 crc kubenswrapper[4681]: E1007 18:04:30.031511 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:04:41 crc kubenswrapper[4681]: I1007 18:04:41.029633 4681 scope.go:117] "RemoveContainer" containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" Oct 07 18:04:41 crc kubenswrapper[4681]: E1007 18:04:41.030339 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:04:53 crc kubenswrapper[4681]: I1007 18:04:53.029924 4681 scope.go:117] "RemoveContainer" containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" Oct 07 18:04:53 crc kubenswrapper[4681]: E1007 18:04:53.031157 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:05:06 crc kubenswrapper[4681]: I1007 18:05:06.029131 4681 scope.go:117] "RemoveContainer" containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" Oct 07 18:05:06 crc kubenswrapper[4681]: E1007 18:05:06.030069 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:05:17 crc kubenswrapper[4681]: I1007 18:05:17.051083 4681 scope.go:117] "RemoveContainer" containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" Oct 07 18:05:17 crc kubenswrapper[4681]: E1007 18:05:17.052307 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:05:32 crc kubenswrapper[4681]: I1007 18:05:32.029529 4681 scope.go:117] "RemoveContainer" containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" Oct 07 18:05:32 crc kubenswrapper[4681]: E1007 18:05:32.031281 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:05:47 crc kubenswrapper[4681]: I1007 18:05:47.028812 4681 scope.go:117] "RemoveContainer" containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" Oct 07 18:05:47 crc kubenswrapper[4681]: E1007 18:05:47.029642 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:05:59 crc kubenswrapper[4681]: I1007 18:05:59.034557 4681 scope.go:117] "RemoveContainer" containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" Oct 07 18:05:59 crc kubenswrapper[4681]: E1007 18:05:59.035291 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:06:13 crc kubenswrapper[4681]: I1007 18:06:13.029939 4681 scope.go:117] "RemoveContainer" containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" Oct 07 18:06:13 crc kubenswrapper[4681]: E1007 18:06:13.031546 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:06:27 crc kubenswrapper[4681]: I1007 18:06:27.052959 4681 scope.go:117] "RemoveContainer" containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" Oct 07 18:06:27 crc kubenswrapper[4681]: E1007 18:06:27.053981 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:06:42 crc kubenswrapper[4681]: I1007 18:06:42.029553 4681 scope.go:117] "RemoveContainer" containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" Oct 07 18:06:42 crc kubenswrapper[4681]: E1007 18:06:42.030309 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:06:54 crc kubenswrapper[4681]: I1007 18:06:54.028855 4681 scope.go:117] "RemoveContainer" containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" Oct 07 18:06:54 crc kubenswrapper[4681]: E1007 18:06:54.029699 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" 
podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:07:07 crc kubenswrapper[4681]: I1007 18:07:07.034147 4681 scope.go:117] "RemoveContainer" containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" Oct 07 18:07:07 crc kubenswrapper[4681]: E1007 18:07:07.034986 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:07:18 crc kubenswrapper[4681]: I1007 18:07:18.029678 4681 scope.go:117] "RemoveContainer" containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" Oct 07 18:07:18 crc kubenswrapper[4681]: E1007 18:07:18.030456 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:07:31 crc kubenswrapper[4681]: I1007 18:07:31.029770 4681 scope.go:117] "RemoveContainer" containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" Oct 07 18:07:31 crc kubenswrapper[4681]: E1007 18:07:31.033028 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:07:45 crc kubenswrapper[4681]: I1007 18:07:45.029035 4681 scope.go:117] "RemoveContainer" containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" Oct 07 18:07:45 crc kubenswrapper[4681]: E1007 18:07:45.029601 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:07:57 crc kubenswrapper[4681]: I1007 18:07:57.036835 4681 scope.go:117] "RemoveContainer" containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" Oct 07 18:07:57 crc kubenswrapper[4681]: E1007 18:07:57.037471 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:08:10 crc kubenswrapper[4681]: I1007 18:08:10.029574 4681 scope.go:117] "RemoveContainer" 
containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" Oct 07 18:08:10 crc kubenswrapper[4681]: E1007 18:08:10.030210 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:08:24 crc kubenswrapper[4681]: I1007 18:08:24.029728 4681 scope.go:117] "RemoveContainer" containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" Oct 07 18:08:24 crc kubenswrapper[4681]: I1007 18:08:24.655759 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerStarted","Data":"aa28d70bece5e9925b181cd804253b1ca6cd72ebb0c8f2a332a9a3e419d4032f"} Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.505203 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dxw5l"] Oct 07 18:09:54 crc kubenswrapper[4681]: E1007 18:09:54.506472 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93fb4eab-c635-4cb3-9377-3ff6957ebf83" containerName="extract-utilities" Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.506522 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="93fb4eab-c635-4cb3-9377-3ff6957ebf83" containerName="extract-utilities" Oct 07 18:09:54 crc kubenswrapper[4681]: E1007 18:09:54.506533 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93fb4eab-c635-4cb3-9377-3ff6957ebf83" containerName="extract-content" Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.506543 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="93fb4eab-c635-4cb3-9377-3ff6957ebf83" containerName="extract-content" Oct 07 18:09:54 crc kubenswrapper[4681]: E1007 18:09:54.506589 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93fb4eab-c635-4cb3-9377-3ff6957ebf83" containerName="registry-server" Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.506600 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="93fb4eab-c635-4cb3-9377-3ff6957ebf83" containerName="registry-server" Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.506847 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="93fb4eab-c635-4cb3-9377-3ff6957ebf83" containerName="registry-server" Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.508895 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dxw5l" Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.513750 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dxw5l"] Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.552354 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d-utilities\") pod \"certified-operators-dxw5l\" (UID: \"7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d\") " pod="openshift-marketplace/certified-operators-dxw5l" Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.552473 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkffg\" (UniqueName: \"kubernetes.io/projected/7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d-kube-api-access-fkffg\") pod \"certified-operators-dxw5l\" (UID: \"7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d\") " pod="openshift-marketplace/certified-operators-dxw5l" Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.552587 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d-catalog-content\") pod \"certified-operators-dxw5l\" (UID: \"7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d\") " pod="openshift-marketplace/certified-operators-dxw5l" Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.654083 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d-utilities\") pod \"certified-operators-dxw5l\" (UID: \"7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d\") " pod="openshift-marketplace/certified-operators-dxw5l" Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.654179 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkffg\" (UniqueName: \"kubernetes.io/projected/7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d-kube-api-access-fkffg\") pod \"certified-operators-dxw5l\" (UID: \"7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d\") " pod="openshift-marketplace/certified-operators-dxw5l" Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.654266 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d-catalog-content\") pod \"certified-operators-dxw5l\" (UID: \"7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d\") " pod="openshift-marketplace/certified-operators-dxw5l" Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.654691 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d-catalog-content\") pod \"certified-operators-dxw5l\" (UID: \"7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d\") " pod="openshift-marketplace/certified-operators-dxw5l" Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.654919 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d-utilities\") pod \"certified-operators-dxw5l\" (UID: \"7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d\") " pod="openshift-marketplace/certified-operators-dxw5l" Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.677842 4681 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fkffg\" (UniqueName: \"kubernetes.io/projected/7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d-kube-api-access-fkffg\") pod \"certified-operators-dxw5l\" (UID: \"7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d\") " pod="openshift-marketplace/certified-operators-dxw5l" Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.695993 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tw4mx"] Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.700182 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tw4mx" Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.710776 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tw4mx"] Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.755584 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d07c5e64-700f-4b97-9030-4279f167c2f3-utilities\") pod \"redhat-operators-tw4mx\" (UID: \"d07c5e64-700f-4b97-9030-4279f167c2f3\") " pod="openshift-marketplace/redhat-operators-tw4mx" Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.755784 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x264s\" (UniqueName: \"kubernetes.io/projected/d07c5e64-700f-4b97-9030-4279f167c2f3-kube-api-access-x264s\") pod \"redhat-operators-tw4mx\" (UID: \"d07c5e64-700f-4b97-9030-4279f167c2f3\") " pod="openshift-marketplace/redhat-operators-tw4mx" Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.755826 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d07c5e64-700f-4b97-9030-4279f167c2f3-catalog-content\") pod \"redhat-operators-tw4mx\" (UID: \"d07c5e64-700f-4b97-9030-4279f167c2f3\") " pod="openshift-marketplace/redhat-operators-tw4mx" Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.836022 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dxw5l" Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.857940 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x264s\" (UniqueName: \"kubernetes.io/projected/d07c5e64-700f-4b97-9030-4279f167c2f3-kube-api-access-x264s\") pod \"redhat-operators-tw4mx\" (UID: \"d07c5e64-700f-4b97-9030-4279f167c2f3\") " pod="openshift-marketplace/redhat-operators-tw4mx" Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.858023 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d07c5e64-700f-4b97-9030-4279f167c2f3-catalog-content\") pod \"redhat-operators-tw4mx\" (UID: \"d07c5e64-700f-4b97-9030-4279f167c2f3\") " pod="openshift-marketplace/redhat-operators-tw4mx" Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.858091 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d07c5e64-700f-4b97-9030-4279f167c2f3-utilities\") pod \"redhat-operators-tw4mx\" (UID: \"d07c5e64-700f-4b97-9030-4279f167c2f3\") " pod="openshift-marketplace/redhat-operators-tw4mx" Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.858665 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d07c5e64-700f-4b97-9030-4279f167c2f3-utilities\") pod \"redhat-operators-tw4mx\" (UID: \"d07c5e64-700f-4b97-9030-4279f167c2f3\") " pod="openshift-marketplace/redhat-operators-tw4mx" Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.859086 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d07c5e64-700f-4b97-9030-4279f167c2f3-catalog-content\") pod \"redhat-operators-tw4mx\" (UID: \"d07c5e64-700f-4b97-9030-4279f167c2f3\") " pod="openshift-marketplace/redhat-operators-tw4mx" Oct 07 18:09:54 crc kubenswrapper[4681]: I1007 18:09:54.876865 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x264s\" (UniqueName: \"kubernetes.io/projected/d07c5e64-700f-4b97-9030-4279f167c2f3-kube-api-access-x264s\") pod \"redhat-operators-tw4mx\" (UID: \"d07c5e64-700f-4b97-9030-4279f167c2f3\") " pod="openshift-marketplace/redhat-operators-tw4mx" Oct 07 18:09:55 crc kubenswrapper[4681]: I1007 18:09:55.069168 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tw4mx" Oct 07 18:09:55 crc kubenswrapper[4681]: I1007 18:09:55.434863 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dxw5l"] Oct 07 18:09:55 crc kubenswrapper[4681]: I1007 18:09:55.484832 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxw5l" event={"ID":"7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d","Type":"ContainerStarted","Data":"3c99e5e2a78c39d0848e9895f29658248ed04e53f69f5ab05891e6a2a30407ae"} Oct 07 18:09:55 crc kubenswrapper[4681]: I1007 18:09:55.763391 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tw4mx"] Oct 07 18:09:55 crc kubenswrapper[4681]: W1007 18:09:55.770378 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd07c5e64_700f_4b97_9030_4279f167c2f3.slice/crio-f85d9d1ee3f94bb741d519292fd6bc2dff6cc587ccb44a16add2c110b15ce90c WatchSource:0}: Error finding container f85d9d1ee3f94bb741d519292fd6bc2dff6cc587ccb44a16add2c110b15ce90c: Status 404 returned error can't find the container with id f85d9d1ee3f94bb741d519292fd6bc2dff6cc587ccb44a16add2c110b15ce90c Oct 07 18:09:56 crc kubenswrapper[4681]: I1007 18:09:56.493943 4681 generic.go:334] "Generic (PLEG): container finished" podID="7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d" containerID="1246d725439734d3236286c4d67f9bbbcc7c66a2853ec0702a000466bcb21b40" exitCode=0 Oct 07 18:09:56 crc kubenswrapper[4681]: I1007 18:09:56.493986 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxw5l" event={"ID":"7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d","Type":"ContainerDied","Data":"1246d725439734d3236286c4d67f9bbbcc7c66a2853ec0702a000466bcb21b40"} Oct 07 18:09:56 crc kubenswrapper[4681]: I1007 18:09:56.495830 4681 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 18:09:56 crc kubenswrapper[4681]: I1007 18:09:56.497738 4681 generic.go:334] "Generic (PLEG): container finished" podID="d07c5e64-700f-4b97-9030-4279f167c2f3" containerID="2b72815f5901775bda2ac349e2dcd0b407a150f004a25502a59b66f531215df4" exitCode=0 Oct 07 18:09:56 crc kubenswrapper[4681]: I1007 18:09:56.497779 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tw4mx" event={"ID":"d07c5e64-700f-4b97-9030-4279f167c2f3","Type":"ContainerDied","Data":"2b72815f5901775bda2ac349e2dcd0b407a150f004a25502a59b66f531215df4"} Oct 07 18:09:56 crc kubenswrapper[4681]: I1007 18:09:56.497815 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tw4mx" event={"ID":"d07c5e64-700f-4b97-9030-4279f167c2f3","Type":"ContainerStarted","Data":"f85d9d1ee3f94bb741d519292fd6bc2dff6cc587ccb44a16add2c110b15ce90c"} Oct 07 18:09:58 crc kubenswrapper[4681]: I1007 18:09:58.529041 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tw4mx" event={"ID":"d07c5e64-700f-4b97-9030-4279f167c2f3","Type":"ContainerStarted","Data":"043ddb0b8e5911072113cb422310b0b1c149ab892f35db8520d06724efa3c8fc"} Oct 07 18:10:03 crc kubenswrapper[4681]: I1007 18:10:03.579931 4681 generic.go:334] "Generic (PLEG): container finished" podID="d07c5e64-700f-4b97-9030-4279f167c2f3" containerID="043ddb0b8e5911072113cb422310b0b1c149ab892f35db8520d06724efa3c8fc" exitCode=0 Oct 07 18:10:03 crc kubenswrapper[4681]: I1007 
18:10:03.580013 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tw4mx" event={"ID":"d07c5e64-700f-4b97-9030-4279f167c2f3","Type":"ContainerDied","Data":"043ddb0b8e5911072113cb422310b0b1c149ab892f35db8520d06724efa3c8fc"} Oct 07 18:10:04 crc kubenswrapper[4681]: I1007 18:10:04.590700 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxw5l" event={"ID":"7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d","Type":"ContainerStarted","Data":"d21fa020ea8f6bab75e903acfa6529461d48819ce74dd42679c192039d32bb67"} Oct 07 18:10:05 crc kubenswrapper[4681]: I1007 18:10:05.601220 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tw4mx" event={"ID":"d07c5e64-700f-4b97-9030-4279f167c2f3","Type":"ContainerStarted","Data":"5795e4937f72d149d9ea7601eb988dfd05776350e79cfeca4d640954b315ce07"} Oct 07 18:10:05 crc kubenswrapper[4681]: I1007 18:10:05.602940 4681 generic.go:334] "Generic (PLEG): container finished" podID="7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d" containerID="d21fa020ea8f6bab75e903acfa6529461d48819ce74dd42679c192039d32bb67" exitCode=0 Oct 07 18:10:05 crc kubenswrapper[4681]: I1007 18:10:05.602970 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxw5l" event={"ID":"7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d","Type":"ContainerDied","Data":"d21fa020ea8f6bab75e903acfa6529461d48819ce74dd42679c192039d32bb67"} Oct 07 18:10:05 crc kubenswrapper[4681]: I1007 18:10:05.628257 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tw4mx" podStartSLOduration=3.941259936 podStartE2EDuration="11.628237381s" podCreationTimestamp="2025-10-07 18:09:54 +0000 UTC" firstStartedPulling="2025-10-07 18:09:56.499865685 +0000 UTC m=+4000.147277240" lastFinishedPulling="2025-10-07 18:10:04.18684313 +0000 UTC m=+4007.834254685" observedRunningTime="2025-10-07 18:10:05.622871202 +0000 UTC m=+4009.270282757" watchObservedRunningTime="2025-10-07 18:10:05.628237381 +0000 UTC m=+4009.275648936" Oct 07 18:10:11 crc kubenswrapper[4681]: I1007 18:10:11.352323 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ndv4t"] Oct 07 18:10:11 crc kubenswrapper[4681]: I1007 18:10:11.355740 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ndv4t" Oct 07 18:10:11 crc kubenswrapper[4681]: I1007 18:10:11.449659 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ndv4t"] Oct 07 18:10:11 crc kubenswrapper[4681]: I1007 18:10:11.518053 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff46847c-55dc-432a-9149-4301d615579c-utilities\") pod \"community-operators-ndv4t\" (UID: \"ff46847c-55dc-432a-9149-4301d615579c\") " pod="openshift-marketplace/community-operators-ndv4t" Oct 07 18:10:11 crc kubenswrapper[4681]: I1007 18:10:11.518112 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff46847c-55dc-432a-9149-4301d615579c-catalog-content\") pod \"community-operators-ndv4t\" (UID: \"ff46847c-55dc-432a-9149-4301d615579c\") " pod="openshift-marketplace/community-operators-ndv4t" Oct 07 18:10:11 crc kubenswrapper[4681]: I1007 18:10:11.518164 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sghp\" (UniqueName: \"kubernetes.io/projected/ff46847c-55dc-432a-9149-4301d615579c-kube-api-access-7sghp\") pod \"community-operators-ndv4t\" (UID: \"ff46847c-55dc-432a-9149-4301d615579c\") " pod="openshift-marketplace/community-operators-ndv4t" Oct 07 18:10:11 crc kubenswrapper[4681]: I1007 18:10:11.619735 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sghp\" (UniqueName: \"kubernetes.io/projected/ff46847c-55dc-432a-9149-4301d615579c-kube-api-access-7sghp\") pod \"community-operators-ndv4t\" (UID: \"ff46847c-55dc-432a-9149-4301d615579c\") " pod="openshift-marketplace/community-operators-ndv4t" Oct 07 18:10:11 crc kubenswrapper[4681]: I1007 18:10:11.619916 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff46847c-55dc-432a-9149-4301d615579c-utilities\") pod \"community-operators-ndv4t\" (UID: \"ff46847c-55dc-432a-9149-4301d615579c\") " pod="openshift-marketplace/community-operators-ndv4t" Oct 07 18:10:11 crc kubenswrapper[4681]: I1007 18:10:11.619953 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff46847c-55dc-432a-9149-4301d615579c-catalog-content\") pod \"community-operators-ndv4t\" (UID: \"ff46847c-55dc-432a-9149-4301d615579c\") " pod="openshift-marketplace/community-operators-ndv4t" Oct 07 18:10:11 crc kubenswrapper[4681]: I1007 18:10:11.620425 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff46847c-55dc-432a-9149-4301d615579c-catalog-content\") pod \"community-operators-ndv4t\" (UID: \"ff46847c-55dc-432a-9149-4301d615579c\") " pod="openshift-marketplace/community-operators-ndv4t" Oct 07 18:10:11 crc kubenswrapper[4681]: I1007 18:10:11.620563 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff46847c-55dc-432a-9149-4301d615579c-utilities\") pod \"community-operators-ndv4t\" (UID: \"ff46847c-55dc-432a-9149-4301d615579c\") " pod="openshift-marketplace/community-operators-ndv4t" Oct 07 18:10:11 crc kubenswrapper[4681]: I1007 18:10:11.648525 4681 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7sghp\" (UniqueName: \"kubernetes.io/projected/ff46847c-55dc-432a-9149-4301d615579c-kube-api-access-7sghp\") pod \"community-operators-ndv4t\" (UID: \"ff46847c-55dc-432a-9149-4301d615579c\") " pod="openshift-marketplace/community-operators-ndv4t" Oct 07 18:10:11 crc kubenswrapper[4681]: I1007 18:10:11.676530 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ndv4t" Oct 07 18:10:12 crc kubenswrapper[4681]: I1007 18:10:12.425653 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ndv4t"] Oct 07 18:10:12 crc kubenswrapper[4681]: I1007 18:10:12.684127 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxw5l" event={"ID":"7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d","Type":"ContainerStarted","Data":"a94c1151b6a2baa41d6c1a4dabfd5365dcd1e94c6bab0342fb39da48a471d717"} Oct 07 18:10:13 crc kubenswrapper[4681]: W1007 18:10:12.999809 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff46847c_55dc_432a_9149_4301d615579c.slice/crio-f162a9c9158b1d64d241cecca5cc1496d3f7d5302b28c0d2b3cdafaa74a88118 WatchSource:0}: Error finding container f162a9c9158b1d64d241cecca5cc1496d3f7d5302b28c0d2b3cdafaa74a88118: Status 404 returned error can't find the container with id f162a9c9158b1d64d241cecca5cc1496d3f7d5302b28c0d2b3cdafaa74a88118 Oct 07 18:10:13 crc kubenswrapper[4681]: I1007 18:10:13.694458 4681 generic.go:334] "Generic (PLEG): container finished" podID="ff46847c-55dc-432a-9149-4301d615579c" containerID="d16204901f7208b313d84dc16bc6c437ba3e6a9b4e4e259fc4a5292efacc89c2" exitCode=0 Oct 07 18:10:13 crc kubenswrapper[4681]: I1007 18:10:13.694640 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ndv4t" event={"ID":"ff46847c-55dc-432a-9149-4301d615579c","Type":"ContainerDied","Data":"d16204901f7208b313d84dc16bc6c437ba3e6a9b4e4e259fc4a5292efacc89c2"} Oct 07 18:10:13 crc kubenswrapper[4681]: I1007 18:10:13.695035 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ndv4t" event={"ID":"ff46847c-55dc-432a-9149-4301d615579c","Type":"ContainerStarted","Data":"f162a9c9158b1d64d241cecca5cc1496d3f7d5302b28c0d2b3cdafaa74a88118"} Oct 07 18:10:14 crc kubenswrapper[4681]: I1007 18:10:14.837167 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dxw5l" Oct 07 18:10:14 crc kubenswrapper[4681]: I1007 18:10:14.837528 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dxw5l" Oct 07 18:10:15 crc kubenswrapper[4681]: I1007 18:10:15.071794 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tw4mx" Oct 07 18:10:15 crc kubenswrapper[4681]: I1007 18:10:15.071857 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tw4mx" Oct 07 18:10:15 crc kubenswrapper[4681]: I1007 18:10:15.715014 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ndv4t" event={"ID":"ff46847c-55dc-432a-9149-4301d615579c","Type":"ContainerStarted","Data":"b279c4dfd1992c949a9021969fd6c592bc08f2da2978a1e2eec6f4faac71ee5f"} Oct 07 18:10:15 crc kubenswrapper[4681]: 
I1007 18:10:15.734344 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dxw5l" podStartSLOduration=6.5092217980000004 podStartE2EDuration="21.734320882s" podCreationTimestamp="2025-10-07 18:09:54 +0000 UTC" firstStartedPulling="2025-10-07 18:09:56.495581645 +0000 UTC m=+4000.142993200" lastFinishedPulling="2025-10-07 18:10:11.720680729 +0000 UTC m=+4015.368092284" observedRunningTime="2025-10-07 18:10:13.742252531 +0000 UTC m=+4017.389664086" watchObservedRunningTime="2025-10-07 18:10:15.734320882 +0000 UTC m=+4019.381732437" Oct 07 18:10:15 crc kubenswrapper[4681]: I1007 18:10:15.907189 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-dxw5l" podUID="7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d" containerName="registry-server" probeResult="failure" output=< Oct 07 18:10:15 crc kubenswrapper[4681]: timeout: failed to connect service ":50051" within 1s Oct 07 18:10:15 crc kubenswrapper[4681]: > Oct 07 18:10:16 crc kubenswrapper[4681]: I1007 18:10:16.117798 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tw4mx" podUID="d07c5e64-700f-4b97-9030-4279f167c2f3" containerName="registry-server" probeResult="failure" output=< Oct 07 18:10:16 crc kubenswrapper[4681]: timeout: failed to connect service ":50051" within 1s Oct 07 18:10:16 crc kubenswrapper[4681]: > Oct 07 18:10:17 crc kubenswrapper[4681]: I1007 18:10:17.738348 4681 generic.go:334] "Generic (PLEG): container finished" podID="ff46847c-55dc-432a-9149-4301d615579c" containerID="b279c4dfd1992c949a9021969fd6c592bc08f2da2978a1e2eec6f4faac71ee5f" exitCode=0 Oct 07 18:10:17 crc kubenswrapper[4681]: I1007 18:10:17.738434 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ndv4t" event={"ID":"ff46847c-55dc-432a-9149-4301d615579c","Type":"ContainerDied","Data":"b279c4dfd1992c949a9021969fd6c592bc08f2da2978a1e2eec6f4faac71ee5f"} Oct 07 18:10:18 crc kubenswrapper[4681]: I1007 18:10:18.749403 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ndv4t" event={"ID":"ff46847c-55dc-432a-9149-4301d615579c","Type":"ContainerStarted","Data":"6c333c89aea2ec36e23a472d387c992e63e3bc26c3b10deb2997e1c112ad4894"} Oct 07 18:10:18 crc kubenswrapper[4681]: I1007 18:10:18.778022 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ndv4t" podStartSLOduration=3.265630212 podStartE2EDuration="7.777999739s" podCreationTimestamp="2025-10-07 18:10:11 +0000 UTC" firstStartedPulling="2025-10-07 18:10:13.696671624 +0000 UTC m=+4017.344083179" lastFinishedPulling="2025-10-07 18:10:18.209041161 +0000 UTC m=+4021.856452706" observedRunningTime="2025-10-07 18:10:18.77156487 +0000 UTC m=+4022.418976425" watchObservedRunningTime="2025-10-07 18:10:18.777999739 +0000 UTC m=+4022.425411294" Oct 07 18:10:21 crc kubenswrapper[4681]: I1007 18:10:21.677921 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ndv4t" Oct 07 18:10:21 crc kubenswrapper[4681]: I1007 18:10:21.678424 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ndv4t" Oct 07 18:10:22 crc kubenswrapper[4681]: I1007 18:10:22.723035 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-ndv4t" 
podUID="ff46847c-55dc-432a-9149-4301d615579c" containerName="registry-server" probeResult="failure" output=< Oct 07 18:10:22 crc kubenswrapper[4681]: timeout: failed to connect service ":50051" within 1s Oct 07 18:10:22 crc kubenswrapper[4681]: > Oct 07 18:10:25 crc kubenswrapper[4681]: I1007 18:10:25.890795 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-dxw5l" podUID="7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d" containerName="registry-server" probeResult="failure" output=< Oct 07 18:10:25 crc kubenswrapper[4681]: timeout: failed to connect service ":50051" within 1s Oct 07 18:10:25 crc kubenswrapper[4681]: > Oct 07 18:10:26 crc kubenswrapper[4681]: I1007 18:10:26.119789 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tw4mx" podUID="d07c5e64-700f-4b97-9030-4279f167c2f3" containerName="registry-server" probeResult="failure" output=< Oct 07 18:10:26 crc kubenswrapper[4681]: timeout: failed to connect service ":50051" within 1s Oct 07 18:10:26 crc kubenswrapper[4681]: > Oct 07 18:10:31 crc kubenswrapper[4681]: I1007 18:10:31.726629 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ndv4t" Oct 07 18:10:31 crc kubenswrapper[4681]: I1007 18:10:31.781741 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ndv4t" Oct 07 18:10:31 crc kubenswrapper[4681]: I1007 18:10:31.972560 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ndv4t"] Oct 07 18:10:32 crc kubenswrapper[4681]: I1007 18:10:32.883084 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ndv4t" podUID="ff46847c-55dc-432a-9149-4301d615579c" containerName="registry-server" containerID="cri-o://6c333c89aea2ec36e23a472d387c992e63e3bc26c3b10deb2997e1c112ad4894" gracePeriod=2 Oct 07 18:10:33 crc kubenswrapper[4681]: I1007 18:10:33.505217 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ndv4t" Oct 07 18:10:33 crc kubenswrapper[4681]: I1007 18:10:33.651532 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff46847c-55dc-432a-9149-4301d615579c-utilities\") pod \"ff46847c-55dc-432a-9149-4301d615579c\" (UID: \"ff46847c-55dc-432a-9149-4301d615579c\") " Oct 07 18:10:33 crc kubenswrapper[4681]: I1007 18:10:33.651681 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sghp\" (UniqueName: \"kubernetes.io/projected/ff46847c-55dc-432a-9149-4301d615579c-kube-api-access-7sghp\") pod \"ff46847c-55dc-432a-9149-4301d615579c\" (UID: \"ff46847c-55dc-432a-9149-4301d615579c\") " Oct 07 18:10:33 crc kubenswrapper[4681]: I1007 18:10:33.651728 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff46847c-55dc-432a-9149-4301d615579c-catalog-content\") pod \"ff46847c-55dc-432a-9149-4301d615579c\" (UID: \"ff46847c-55dc-432a-9149-4301d615579c\") " Oct 07 18:10:33 crc kubenswrapper[4681]: I1007 18:10:33.658309 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff46847c-55dc-432a-9149-4301d615579c-utilities" (OuterVolumeSpecName: "utilities") pod "ff46847c-55dc-432a-9149-4301d615579c" (UID: "ff46847c-55dc-432a-9149-4301d615579c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 18:10:33 crc kubenswrapper[4681]: I1007 18:10:33.672781 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff46847c-55dc-432a-9149-4301d615579c-kube-api-access-7sghp" (OuterVolumeSpecName: "kube-api-access-7sghp") pod "ff46847c-55dc-432a-9149-4301d615579c" (UID: "ff46847c-55dc-432a-9149-4301d615579c"). InnerVolumeSpecName "kube-api-access-7sghp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 18:10:33 crc kubenswrapper[4681]: I1007 18:10:33.703443 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff46847c-55dc-432a-9149-4301d615579c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff46847c-55dc-432a-9149-4301d615579c" (UID: "ff46847c-55dc-432a-9149-4301d615579c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 18:10:33 crc kubenswrapper[4681]: I1007 18:10:33.754216 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sghp\" (UniqueName: \"kubernetes.io/projected/ff46847c-55dc-432a-9149-4301d615579c-kube-api-access-7sghp\") on node \"crc\" DevicePath \"\"" Oct 07 18:10:33 crc kubenswrapper[4681]: I1007 18:10:33.754255 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff46847c-55dc-432a-9149-4301d615579c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 18:10:33 crc kubenswrapper[4681]: I1007 18:10:33.754264 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff46847c-55dc-432a-9149-4301d615579c-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 18:10:33 crc kubenswrapper[4681]: I1007 18:10:33.896647 4681 generic.go:334] "Generic (PLEG): container finished" podID="ff46847c-55dc-432a-9149-4301d615579c" containerID="6c333c89aea2ec36e23a472d387c992e63e3bc26c3b10deb2997e1c112ad4894" exitCode=0 Oct 07 18:10:33 crc kubenswrapper[4681]: I1007 18:10:33.896715 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ndv4t" event={"ID":"ff46847c-55dc-432a-9149-4301d615579c","Type":"ContainerDied","Data":"6c333c89aea2ec36e23a472d387c992e63e3bc26c3b10deb2997e1c112ad4894"} Oct 07 18:10:33 crc kubenswrapper[4681]: I1007 18:10:33.896755 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ndv4t" Oct 07 18:10:33 crc kubenswrapper[4681]: I1007 18:10:33.896807 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ndv4t" event={"ID":"ff46847c-55dc-432a-9149-4301d615579c","Type":"ContainerDied","Data":"f162a9c9158b1d64d241cecca5cc1496d3f7d5302b28c0d2b3cdafaa74a88118"} Oct 07 18:10:33 crc kubenswrapper[4681]: I1007 18:10:33.896835 4681 scope.go:117] "RemoveContainer" containerID="6c333c89aea2ec36e23a472d387c992e63e3bc26c3b10deb2997e1c112ad4894" Oct 07 18:10:33 crc kubenswrapper[4681]: I1007 18:10:33.925460 4681 scope.go:117] "RemoveContainer" containerID="b279c4dfd1992c949a9021969fd6c592bc08f2da2978a1e2eec6f4faac71ee5f" Oct 07 18:10:33 crc kubenswrapper[4681]: I1007 18:10:33.935582 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ndv4t"] Oct 07 18:10:33 crc kubenswrapper[4681]: I1007 18:10:33.947017 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ndv4t"] Oct 07 18:10:34 crc kubenswrapper[4681]: I1007 18:10:34.553391 4681 scope.go:117] "RemoveContainer" containerID="d16204901f7208b313d84dc16bc6c437ba3e6a9b4e4e259fc4a5292efacc89c2" Oct 07 18:10:34 crc kubenswrapper[4681]: I1007 18:10:34.590151 4681 scope.go:117] "RemoveContainer" containerID="6c333c89aea2ec36e23a472d387c992e63e3bc26c3b10deb2997e1c112ad4894" Oct 07 18:10:34 crc kubenswrapper[4681]: E1007 18:10:34.594540 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c333c89aea2ec36e23a472d387c992e63e3bc26c3b10deb2997e1c112ad4894\": container with ID starting with 6c333c89aea2ec36e23a472d387c992e63e3bc26c3b10deb2997e1c112ad4894 not found: ID does not exist" containerID="6c333c89aea2ec36e23a472d387c992e63e3bc26c3b10deb2997e1c112ad4894" Oct 07 18:10:34 crc kubenswrapper[4681]: I1007 18:10:34.594586 
4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c333c89aea2ec36e23a472d387c992e63e3bc26c3b10deb2997e1c112ad4894"} err="failed to get container status \"6c333c89aea2ec36e23a472d387c992e63e3bc26c3b10deb2997e1c112ad4894\": rpc error: code = NotFound desc = could not find container \"6c333c89aea2ec36e23a472d387c992e63e3bc26c3b10deb2997e1c112ad4894\": container with ID starting with 6c333c89aea2ec36e23a472d387c992e63e3bc26c3b10deb2997e1c112ad4894 not found: ID does not exist" Oct 07 18:10:34 crc kubenswrapper[4681]: I1007 18:10:34.594615 4681 scope.go:117] "RemoveContainer" containerID="b279c4dfd1992c949a9021969fd6c592bc08f2da2978a1e2eec6f4faac71ee5f" Oct 07 18:10:34 crc kubenswrapper[4681]: E1007 18:10:34.596567 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b279c4dfd1992c949a9021969fd6c592bc08f2da2978a1e2eec6f4faac71ee5f\": container with ID starting with b279c4dfd1992c949a9021969fd6c592bc08f2da2978a1e2eec6f4faac71ee5f not found: ID does not exist" containerID="b279c4dfd1992c949a9021969fd6c592bc08f2da2978a1e2eec6f4faac71ee5f" Oct 07 18:10:34 crc kubenswrapper[4681]: I1007 18:10:34.596608 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b279c4dfd1992c949a9021969fd6c592bc08f2da2978a1e2eec6f4faac71ee5f"} err="failed to get container status \"b279c4dfd1992c949a9021969fd6c592bc08f2da2978a1e2eec6f4faac71ee5f\": rpc error: code = NotFound desc = could not find container \"b279c4dfd1992c949a9021969fd6c592bc08f2da2978a1e2eec6f4faac71ee5f\": container with ID starting with b279c4dfd1992c949a9021969fd6c592bc08f2da2978a1e2eec6f4faac71ee5f not found: ID does not exist" Oct 07 18:10:34 crc kubenswrapper[4681]: I1007 18:10:34.596636 4681 scope.go:117] "RemoveContainer" containerID="d16204901f7208b313d84dc16bc6c437ba3e6a9b4e4e259fc4a5292efacc89c2" Oct 07 18:10:34 crc kubenswrapper[4681]: E1007 18:10:34.603333 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d16204901f7208b313d84dc16bc6c437ba3e6a9b4e4e259fc4a5292efacc89c2\": container with ID starting with d16204901f7208b313d84dc16bc6c437ba3e6a9b4e4e259fc4a5292efacc89c2 not found: ID does not exist" containerID="d16204901f7208b313d84dc16bc6c437ba3e6a9b4e4e259fc4a5292efacc89c2" Oct 07 18:10:34 crc kubenswrapper[4681]: I1007 18:10:34.603392 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d16204901f7208b313d84dc16bc6c437ba3e6a9b4e4e259fc4a5292efacc89c2"} err="failed to get container status \"d16204901f7208b313d84dc16bc6c437ba3e6a9b4e4e259fc4a5292efacc89c2\": rpc error: code = NotFound desc = could not find container \"d16204901f7208b313d84dc16bc6c437ba3e6a9b4e4e259fc4a5292efacc89c2\": container with ID starting with d16204901f7208b313d84dc16bc6c437ba3e6a9b4e4e259fc4a5292efacc89c2 not found: ID does not exist" Oct 07 18:10:34 crc kubenswrapper[4681]: I1007 18:10:34.885287 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dxw5l" Oct 07 18:10:34 crc kubenswrapper[4681]: I1007 18:10:34.933642 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dxw5l" Oct 07 18:10:35 crc kubenswrapper[4681]: I1007 18:10:35.040740 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ff46847c-55dc-432a-9149-4301d615579c" path="/var/lib/kubelet/pods/ff46847c-55dc-432a-9149-4301d615579c/volumes" Oct 07 18:10:36 crc kubenswrapper[4681]: I1007 18:10:36.116721 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tw4mx" podUID="d07c5e64-700f-4b97-9030-4279f167c2f3" containerName="registry-server" probeResult="failure" output=< Oct 07 18:10:36 crc kubenswrapper[4681]: timeout: failed to connect service ":50051" within 1s Oct 07 18:10:36 crc kubenswrapper[4681]: > Oct 07 18:10:36 crc kubenswrapper[4681]: I1007 18:10:36.799499 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dxw5l"] Oct 07 18:10:37 crc kubenswrapper[4681]: I1007 18:10:37.176223 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c95wr"] Oct 07 18:10:37 crc kubenswrapper[4681]: I1007 18:10:37.176770 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c95wr" podUID="c2743c88-7c95-463b-b5d3-4d183dd1e3e1" containerName="registry-server" containerID="cri-o://5d91b44754e132fd1f445d23de7b6f8f7a5e5f3f813a1fbefe55190e307b1c77" gracePeriod=2 Oct 07 18:10:37 crc kubenswrapper[4681]: E1007 18:10:37.724333 4681 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5d91b44754e132fd1f445d23de7b6f8f7a5e5f3f813a1fbefe55190e307b1c77 is running failed: container process not found" containerID="5d91b44754e132fd1f445d23de7b6f8f7a5e5f3f813a1fbefe55190e307b1c77" cmd=["grpc_health_probe","-addr=:50051"] Oct 07 18:10:37 crc kubenswrapper[4681]: E1007 18:10:37.724868 4681 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5d91b44754e132fd1f445d23de7b6f8f7a5e5f3f813a1fbefe55190e307b1c77 is running failed: container process not found" containerID="5d91b44754e132fd1f445d23de7b6f8f7a5e5f3f813a1fbefe55190e307b1c77" cmd=["grpc_health_probe","-addr=:50051"] Oct 07 18:10:37 crc kubenswrapper[4681]: E1007 18:10:37.725120 4681 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5d91b44754e132fd1f445d23de7b6f8f7a5e5f3f813a1fbefe55190e307b1c77 is running failed: container process not found" containerID="5d91b44754e132fd1f445d23de7b6f8f7a5e5f3f813a1fbefe55190e307b1c77" cmd=["grpc_health_probe","-addr=:50051"] Oct 07 18:10:37 crc kubenswrapper[4681]: E1007 18:10:37.725147 4681 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5d91b44754e132fd1f445d23de7b6f8f7a5e5f3f813a1fbefe55190e307b1c77 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-c95wr" podUID="c2743c88-7c95-463b-b5d3-4d183dd1e3e1" containerName="registry-server" Oct 07 18:10:37 crc kubenswrapper[4681]: I1007 18:10:37.749217 4681 util.go:48] "No ready sandbox for pod can be found. 
Oct 07 18:10:37 crc kubenswrapper[4681]: I1007 18:10:37.829229 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2743c88-7c95-463b-b5d3-4d183dd1e3e1-catalog-content\") pod \"c2743c88-7c95-463b-b5d3-4d183dd1e3e1\" (UID: \"c2743c88-7c95-463b-b5d3-4d183dd1e3e1\") " Oct 07 18:10:37 crc kubenswrapper[4681]: I1007 18:10:37.829442 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qz54x\" (UniqueName: \"kubernetes.io/projected/c2743c88-7c95-463b-b5d3-4d183dd1e3e1-kube-api-access-qz54x\") pod \"c2743c88-7c95-463b-b5d3-4d183dd1e3e1\" (UID: \"c2743c88-7c95-463b-b5d3-4d183dd1e3e1\") " Oct 07 18:10:37 crc kubenswrapper[4681]: I1007 18:10:37.829512 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2743c88-7c95-463b-b5d3-4d183dd1e3e1-utilities\") pod \"c2743c88-7c95-463b-b5d3-4d183dd1e3e1\" (UID: \"c2743c88-7c95-463b-b5d3-4d183dd1e3e1\") " Oct 07 18:10:37 crc kubenswrapper[4681]: I1007 18:10:37.834116 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2743c88-7c95-463b-b5d3-4d183dd1e3e1-utilities" (OuterVolumeSpecName: "utilities") pod "c2743c88-7c95-463b-b5d3-4d183dd1e3e1" (UID: "c2743c88-7c95-463b-b5d3-4d183dd1e3e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 18:10:37 crc kubenswrapper[4681]: I1007 18:10:37.841149 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2743c88-7c95-463b-b5d3-4d183dd1e3e1-kube-api-access-qz54x" (OuterVolumeSpecName: "kube-api-access-qz54x") pod "c2743c88-7c95-463b-b5d3-4d183dd1e3e1" (UID: "c2743c88-7c95-463b-b5d3-4d183dd1e3e1"). InnerVolumeSpecName "kube-api-access-qz54x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 18:10:37 crc kubenswrapper[4681]: I1007 18:10:37.896006 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2743c88-7c95-463b-b5d3-4d183dd1e3e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2743c88-7c95-463b-b5d3-4d183dd1e3e1" (UID: "c2743c88-7c95-463b-b5d3-4d183dd1e3e1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
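
The teardown above runs per volume in a fixed order: the reconciler starts UnmountVolume, the operation generator's TearDown does the plugin-specific cleanup, and only afterwards does the reconciler report the volume as detached (the reconciler_common.go:293 entries that follow). A schematic of that ordering; Unmounter and emptyDir are illustrative stand-ins, not the kubelet's actual types:

package main

import "fmt"

// Unmounter stands in for a volume plugin's teardown hook
// (empty-dir, projected, ...); not the real kubelet interface.
type Unmounter interface {
	TearDown() error
}

type emptyDir struct{ name string }

func (e emptyDir) TearDown() error { return nil } // would delete the directory contents

// reconcile mirrors the logged ordering: announce the unmount, run teardown,
// and only report "detached" once teardown has succeeded.
func reconcile(volumes map[string]Unmounter) {
	for name, u := range volumes {
		fmt.Printf("UnmountVolume started for volume %q\n", name)
		if err := u.TearDown(); err != nil {
			fmt.Printf("TearDown failed for %q: %v\n", name, err)
			continue // left mounted; retried on the next reconcile pass
		}
		fmt.Printf("Volume detached for volume %q\n", name)
	}
}

func main() {
	reconcile(map[string]Unmounter{
		"catalog-content": emptyDir{"catalog-content"},
		"utilities":       emptyDir{"utilities"},
	})
}
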
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 18:10:37 crc kubenswrapper[4681]: I1007 18:10:37.931393 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qz54x\" (UniqueName: \"kubernetes.io/projected/c2743c88-7c95-463b-b5d3-4d183dd1e3e1-kube-api-access-qz54x\") on node \"crc\" DevicePath \"\"" Oct 07 18:10:37 crc kubenswrapper[4681]: I1007 18:10:37.931421 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2743c88-7c95-463b-b5d3-4d183dd1e3e1-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 18:10:37 crc kubenswrapper[4681]: I1007 18:10:37.931432 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2743c88-7c95-463b-b5d3-4d183dd1e3e1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 18:10:37 crc kubenswrapper[4681]: I1007 18:10:37.939589 4681 generic.go:334] "Generic (PLEG): container finished" podID="c2743c88-7c95-463b-b5d3-4d183dd1e3e1" containerID="5d91b44754e132fd1f445d23de7b6f8f7a5e5f3f813a1fbefe55190e307b1c77" exitCode=0 Oct 07 18:10:37 crc kubenswrapper[4681]: I1007 18:10:37.939770 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c95wr" event={"ID":"c2743c88-7c95-463b-b5d3-4d183dd1e3e1","Type":"ContainerDied","Data":"5d91b44754e132fd1f445d23de7b6f8f7a5e5f3f813a1fbefe55190e307b1c77"} Oct 07 18:10:37 crc kubenswrapper[4681]: I1007 18:10:37.939851 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c95wr" event={"ID":"c2743c88-7c95-463b-b5d3-4d183dd1e3e1","Type":"ContainerDied","Data":"8c9b6ff65d60cad1554e754a14a051a34e30a530ac447551bcd0f178c0d291be"} Oct 07 18:10:37 crc kubenswrapper[4681]: I1007 18:10:37.939914 4681 scope.go:117] "RemoveContainer" containerID="5d91b44754e132fd1f445d23de7b6f8f7a5e5f3f813a1fbefe55190e307b1c77" Oct 07 18:10:37 crc kubenswrapper[4681]: I1007 18:10:37.940038 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c95wr" Oct 07 18:10:37 crc kubenswrapper[4681]: I1007 18:10:37.969074 4681 scope.go:117] "RemoveContainer" containerID="9a852c6be82ad89fbe89d5053c4a0e4e5f9b311041d15f2ff56f260564104425" Oct 07 18:10:37 crc kubenswrapper[4681]: I1007 18:10:37.983420 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c95wr"] Oct 07 18:10:38 crc kubenswrapper[4681]: I1007 18:10:38.007589 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c95wr"] Oct 07 18:10:38 crc kubenswrapper[4681]: I1007 18:10:38.000923 4681 scope.go:117] "RemoveContainer" containerID="5cc7db48631c86adb6952fffd18a7b27714bd15f34c8fe5b05a700c56e4e4ac3" Oct 07 18:10:38 crc kubenswrapper[4681]: I1007 18:10:38.053072 4681 scope.go:117] "RemoveContainer" containerID="5d91b44754e132fd1f445d23de7b6f8f7a5e5f3f813a1fbefe55190e307b1c77" Oct 07 18:10:38 crc kubenswrapper[4681]: E1007 18:10:38.053574 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d91b44754e132fd1f445d23de7b6f8f7a5e5f3f813a1fbefe55190e307b1c77\": container with ID starting with 5d91b44754e132fd1f445d23de7b6f8f7a5e5f3f813a1fbefe55190e307b1c77 not found: ID does not exist" containerID="5d91b44754e132fd1f445d23de7b6f8f7a5e5f3f813a1fbefe55190e307b1c77" Oct 07 18:10:38 crc kubenswrapper[4681]: I1007 18:10:38.053615 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d91b44754e132fd1f445d23de7b6f8f7a5e5f3f813a1fbefe55190e307b1c77"} err="failed to get container status \"5d91b44754e132fd1f445d23de7b6f8f7a5e5f3f813a1fbefe55190e307b1c77\": rpc error: code = NotFound desc = could not find container \"5d91b44754e132fd1f445d23de7b6f8f7a5e5f3f813a1fbefe55190e307b1c77\": container with ID starting with 5d91b44754e132fd1f445d23de7b6f8f7a5e5f3f813a1fbefe55190e307b1c77 not found: ID does not exist" Oct 07 18:10:38 crc kubenswrapper[4681]: I1007 18:10:38.053643 4681 scope.go:117] "RemoveContainer" containerID="9a852c6be82ad89fbe89d5053c4a0e4e5f9b311041d15f2ff56f260564104425" Oct 07 18:10:38 crc kubenswrapper[4681]: E1007 18:10:38.054103 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a852c6be82ad89fbe89d5053c4a0e4e5f9b311041d15f2ff56f260564104425\": container with ID starting with 9a852c6be82ad89fbe89d5053c4a0e4e5f9b311041d15f2ff56f260564104425 not found: ID does not exist" containerID="9a852c6be82ad89fbe89d5053c4a0e4e5f9b311041d15f2ff56f260564104425" Oct 07 18:10:38 crc kubenswrapper[4681]: I1007 18:10:38.054138 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a852c6be82ad89fbe89d5053c4a0e4e5f9b311041d15f2ff56f260564104425"} err="failed to get container status \"9a852c6be82ad89fbe89d5053c4a0e4e5f9b311041d15f2ff56f260564104425\": rpc error: code = NotFound desc = could not find container \"9a852c6be82ad89fbe89d5053c4a0e4e5f9b311041d15f2ff56f260564104425\": container with ID starting with 9a852c6be82ad89fbe89d5053c4a0e4e5f9b311041d15f2ff56f260564104425 not found: ID does not exist" Oct 07 18:10:38 crc kubenswrapper[4681]: I1007 18:10:38.054158 4681 scope.go:117] "RemoveContainer" containerID="5cc7db48631c86adb6952fffd18a7b27714bd15f34c8fe5b05a700c56e4e4ac3" Oct 07 18:10:38 crc kubenswrapper[4681]: E1007 18:10:38.054489 4681 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"5cc7db48631c86adb6952fffd18a7b27714bd15f34c8fe5b05a700c56e4e4ac3\": container with ID starting with 5cc7db48631c86adb6952fffd18a7b27714bd15f34c8fe5b05a700c56e4e4ac3 not found: ID does not exist" containerID="5cc7db48631c86adb6952fffd18a7b27714bd15f34c8fe5b05a700c56e4e4ac3" Oct 07 18:10:38 crc kubenswrapper[4681]: I1007 18:10:38.054534 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cc7db48631c86adb6952fffd18a7b27714bd15f34c8fe5b05a700c56e4e4ac3"} err="failed to get container status \"5cc7db48631c86adb6952fffd18a7b27714bd15f34c8fe5b05a700c56e4e4ac3\": rpc error: code = NotFound desc = could not find container \"5cc7db48631c86adb6952fffd18a7b27714bd15f34c8fe5b05a700c56e4e4ac3\": container with ID starting with 5cc7db48631c86adb6952fffd18a7b27714bd15f34c8fe5b05a700c56e4e4ac3 not found: ID does not exist" Oct 07 18:10:39 crc kubenswrapper[4681]: I1007 18:10:39.043235 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2743c88-7c95-463b-b5d3-4d183dd1e3e1" path="/var/lib/kubelet/pods/c2743c88-7c95-463b-b5d3-4d183dd1e3e1/volumes" Oct 07 18:10:42 crc kubenswrapper[4681]: I1007 18:10:42.195073 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 18:10:42 crc kubenswrapper[4681]: I1007 18:10:42.195610 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 18:10:45 crc kubenswrapper[4681]: I1007 18:10:45.124852 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tw4mx" Oct 07 18:10:45 crc kubenswrapper[4681]: I1007 18:10:45.174260 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tw4mx" Oct 07 18:10:48 crc kubenswrapper[4681]: I1007 18:10:48.408904 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tw4mx"] Oct 07 18:10:48 crc kubenswrapper[4681]: I1007 18:10:48.409452 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tw4mx" podUID="d07c5e64-700f-4b97-9030-4279f167c2f3" containerName="registry-server" containerID="cri-o://5795e4937f72d149d9ea7601eb988dfd05776350e79cfeca4d640954b315ce07" gracePeriod=2 Oct 07 18:10:49 crc kubenswrapper[4681]: I1007 18:10:49.038515 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tw4mx" Oct 07 18:10:49 crc kubenswrapper[4681]: I1007 18:10:49.050240 4681 generic.go:334] "Generic (PLEG): container finished" podID="d07c5e64-700f-4b97-9030-4279f167c2f3" containerID="5795e4937f72d149d9ea7601eb988dfd05776350e79cfeca4d640954b315ce07" exitCode=0 Oct 07 18:10:49 crc kubenswrapper[4681]: I1007 18:10:49.050280 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tw4mx" event={"ID":"d07c5e64-700f-4b97-9030-4279f167c2f3","Type":"ContainerDied","Data":"5795e4937f72d149d9ea7601eb988dfd05776350e79cfeca4d640954b315ce07"} Oct 07 18:10:49 crc kubenswrapper[4681]: I1007 18:10:49.050295 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tw4mx" Oct 07 18:10:49 crc kubenswrapper[4681]: I1007 18:10:49.050308 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tw4mx" event={"ID":"d07c5e64-700f-4b97-9030-4279f167c2f3","Type":"ContainerDied","Data":"f85d9d1ee3f94bb741d519292fd6bc2dff6cc587ccb44a16add2c110b15ce90c"} Oct 07 18:10:49 crc kubenswrapper[4681]: I1007 18:10:49.050326 4681 scope.go:117] "RemoveContainer" containerID="5795e4937f72d149d9ea7601eb988dfd05776350e79cfeca4d640954b315ce07" Oct 07 18:10:49 crc kubenswrapper[4681]: I1007 18:10:49.105080 4681 scope.go:117] "RemoveContainer" containerID="043ddb0b8e5911072113cb422310b0b1c149ab892f35db8520d06724efa3c8fc" Oct 07 18:10:49 crc kubenswrapper[4681]: I1007 18:10:49.128745 4681 scope.go:117] "RemoveContainer" containerID="2b72815f5901775bda2ac349e2dcd0b407a150f004a25502a59b66f531215df4" Oct 07 18:10:49 crc kubenswrapper[4681]: I1007 18:10:49.147837 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x264s\" (UniqueName: \"kubernetes.io/projected/d07c5e64-700f-4b97-9030-4279f167c2f3-kube-api-access-x264s\") pod \"d07c5e64-700f-4b97-9030-4279f167c2f3\" (UID: \"d07c5e64-700f-4b97-9030-4279f167c2f3\") " Oct 07 18:10:49 crc kubenswrapper[4681]: I1007 18:10:49.147872 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d07c5e64-700f-4b97-9030-4279f167c2f3-utilities\") pod \"d07c5e64-700f-4b97-9030-4279f167c2f3\" (UID: \"d07c5e64-700f-4b97-9030-4279f167c2f3\") " Oct 07 18:10:49 crc kubenswrapper[4681]: I1007 18:10:49.148056 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d07c5e64-700f-4b97-9030-4279f167c2f3-catalog-content\") pod \"d07c5e64-700f-4b97-9030-4279f167c2f3\" (UID: \"d07c5e64-700f-4b97-9030-4279f167c2f3\") " Oct 07 18:10:49 crc kubenswrapper[4681]: I1007 18:10:49.150365 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d07c5e64-700f-4b97-9030-4279f167c2f3-utilities" (OuterVolumeSpecName: "utilities") pod "d07c5e64-700f-4b97-9030-4279f167c2f3" (UID: "d07c5e64-700f-4b97-9030-4279f167c2f3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 18:10:49 crc kubenswrapper[4681]: I1007 18:10:49.169142 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d07c5e64-700f-4b97-9030-4279f167c2f3-kube-api-access-x264s" (OuterVolumeSpecName: "kube-api-access-x264s") pod "d07c5e64-700f-4b97-9030-4279f167c2f3" (UID: "d07c5e64-700f-4b97-9030-4279f167c2f3"). InnerVolumeSpecName "kube-api-access-x264s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 18:10:49 crc kubenswrapper[4681]: I1007 18:10:49.189678 4681 scope.go:117] "RemoveContainer" containerID="5795e4937f72d149d9ea7601eb988dfd05776350e79cfeca4d640954b315ce07" Oct 07 18:10:49 crc kubenswrapper[4681]: E1007 18:10:49.192299 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5795e4937f72d149d9ea7601eb988dfd05776350e79cfeca4d640954b315ce07\": container with ID starting with 5795e4937f72d149d9ea7601eb988dfd05776350e79cfeca4d640954b315ce07 not found: ID does not exist" containerID="5795e4937f72d149d9ea7601eb988dfd05776350e79cfeca4d640954b315ce07" Oct 07 18:10:49 crc kubenswrapper[4681]: I1007 18:10:49.192340 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5795e4937f72d149d9ea7601eb988dfd05776350e79cfeca4d640954b315ce07"} err="failed to get container status \"5795e4937f72d149d9ea7601eb988dfd05776350e79cfeca4d640954b315ce07\": rpc error: code = NotFound desc = could not find container \"5795e4937f72d149d9ea7601eb988dfd05776350e79cfeca4d640954b315ce07\": container with ID starting with 5795e4937f72d149d9ea7601eb988dfd05776350e79cfeca4d640954b315ce07 not found: ID does not exist" Oct 07 18:10:49 crc kubenswrapper[4681]: I1007 18:10:49.192370 4681 scope.go:117] "RemoveContainer" containerID="043ddb0b8e5911072113cb422310b0b1c149ab892f35db8520d06724efa3c8fc" Oct 07 18:10:49 crc kubenswrapper[4681]: E1007 18:10:49.192829 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"043ddb0b8e5911072113cb422310b0b1c149ab892f35db8520d06724efa3c8fc\": container with ID starting with 043ddb0b8e5911072113cb422310b0b1c149ab892f35db8520d06724efa3c8fc not found: ID does not exist" containerID="043ddb0b8e5911072113cb422310b0b1c149ab892f35db8520d06724efa3c8fc" Oct 07 18:10:49 crc kubenswrapper[4681]: I1007 18:10:49.192916 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"043ddb0b8e5911072113cb422310b0b1c149ab892f35db8520d06724efa3c8fc"} err="failed to get container status \"043ddb0b8e5911072113cb422310b0b1c149ab892f35db8520d06724efa3c8fc\": rpc error: code = NotFound desc = could not find container \"043ddb0b8e5911072113cb422310b0b1c149ab892f35db8520d06724efa3c8fc\": container with ID starting with 043ddb0b8e5911072113cb422310b0b1c149ab892f35db8520d06724efa3c8fc not found: ID does not exist" Oct 07 18:10:49 crc kubenswrapper[4681]: I1007 18:10:49.192952 4681 scope.go:117] "RemoveContainer" containerID="2b72815f5901775bda2ac349e2dcd0b407a150f004a25502a59b66f531215df4" Oct 07 18:10:49 crc kubenswrapper[4681]: E1007 18:10:49.193552 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b72815f5901775bda2ac349e2dcd0b407a150f004a25502a59b66f531215df4\": container with ID starting with 2b72815f5901775bda2ac349e2dcd0b407a150f004a25502a59b66f531215df4 not found: ID does not 
exist" containerID="2b72815f5901775bda2ac349e2dcd0b407a150f004a25502a59b66f531215df4" Oct 07 18:10:49 crc kubenswrapper[4681]: I1007 18:10:49.193579 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b72815f5901775bda2ac349e2dcd0b407a150f004a25502a59b66f531215df4"} err="failed to get container status \"2b72815f5901775bda2ac349e2dcd0b407a150f004a25502a59b66f531215df4\": rpc error: code = NotFound desc = could not find container \"2b72815f5901775bda2ac349e2dcd0b407a150f004a25502a59b66f531215df4\": container with ID starting with 2b72815f5901775bda2ac349e2dcd0b407a150f004a25502a59b66f531215df4 not found: ID does not exist" Oct 07 18:10:49 crc kubenswrapper[4681]: I1007 18:10:49.252347 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x264s\" (UniqueName: \"kubernetes.io/projected/d07c5e64-700f-4b97-9030-4279f167c2f3-kube-api-access-x264s\") on node \"crc\" DevicePath \"\"" Oct 07 18:10:49 crc kubenswrapper[4681]: I1007 18:10:49.252378 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d07c5e64-700f-4b97-9030-4279f167c2f3-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 18:10:49 crc kubenswrapper[4681]: I1007 18:10:49.311935 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d07c5e64-700f-4b97-9030-4279f167c2f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d07c5e64-700f-4b97-9030-4279f167c2f3" (UID: "d07c5e64-700f-4b97-9030-4279f167c2f3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 18:10:49 crc kubenswrapper[4681]: I1007 18:10:49.354485 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d07c5e64-700f-4b97-9030-4279f167c2f3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 18:10:49 crc kubenswrapper[4681]: I1007 18:10:49.379648 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tw4mx"] Oct 07 18:10:49 crc kubenswrapper[4681]: I1007 18:10:49.391318 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tw4mx"] Oct 07 18:10:51 crc kubenswrapper[4681]: I1007 18:10:51.041374 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d07c5e64-700f-4b97-9030-4279f167c2f3" path="/var/lib/kubelet/pods/d07c5e64-700f-4b97-9030-4279f167c2f3/volumes" Oct 07 18:11:12 crc kubenswrapper[4681]: I1007 18:11:12.195294 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 18:11:12 crc kubenswrapper[4681]: I1007 18:11:12.195781 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 18:11:42 crc kubenswrapper[4681]: I1007 18:11:42.195369 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 18:11:42 crc kubenswrapper[4681]: I1007 18:11:42.195924 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 18:11:42 crc kubenswrapper[4681]: I1007 18:11:42.195971 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" Oct 07 18:11:42 crc kubenswrapper[4681]: I1007 18:11:42.196661 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aa28d70bece5e9925b181cd804253b1ca6cd72ebb0c8f2a332a9a3e419d4032f"} pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 18:11:42 crc kubenswrapper[4681]: I1007 18:11:42.196792 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" containerID="cri-o://aa28d70bece5e9925b181cd804253b1ca6cd72ebb0c8f2a332a9a3e419d4032f" gracePeriod=600 Oct 07 18:11:42 crc kubenswrapper[4681]: I1007 18:11:42.528244 4681 generic.go:334] "Generic (PLEG): container finished" podID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerID="aa28d70bece5e9925b181cd804253b1ca6cd72ebb0c8f2a332a9a3e419d4032f" exitCode=0 Oct 07 18:11:42 crc kubenswrapper[4681]: I1007 18:11:42.528327 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerDied","Data":"aa28d70bece5e9925b181cd804253b1ca6cd72ebb0c8f2a332a9a3e419d4032f"} Oct 07 18:11:42 crc kubenswrapper[4681]: I1007 18:11:42.528663 4681 scope.go:117] "RemoveContainer" containerID="abf23ded81b4f3c410cfe0c8cc3828f49c4c68610c679297c3102e73f59f238d" Oct 07 18:11:43 crc kubenswrapper[4681]: I1007 18:11:43.541991 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerStarted","Data":"04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24"} Oct 07 18:13:42 crc kubenswrapper[4681]: I1007 18:13:42.195667 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 18:13:42 crc kubenswrapper[4681]: I1007 18:13:42.196194 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 18:14:12 crc kubenswrapper[4681]: I1007 18:14:12.196181 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon 
Oct 07 18:14:12 crc kubenswrapper[4681]: I1007 18:14:12.196946 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 18:14:42 crc kubenswrapper[4681]: I1007 18:14:42.194966 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 18:14:42 crc kubenswrapper[4681]: I1007 18:14:42.195392 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 18:14:42 crc kubenswrapper[4681]: I1007 18:14:42.195655 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" Oct 07 18:14:42 crc kubenswrapper[4681]: I1007 18:14:42.196418 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24"} pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 18:14:42 crc kubenswrapper[4681]: I1007 18:14:42.196478 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" containerID="cri-o://04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" gracePeriod=600 Oct 07 18:14:42 crc kubenswrapper[4681]: E1007 18:14:42.318698 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:14:43 crc kubenswrapper[4681]: I1007 18:14:43.101629 4681 generic.go:334] "Generic (PLEG): container finished" podID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" exitCode=0 Oct 07 18:14:43 crc kubenswrapper[4681]: I1007 18:14:43.101686 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerDied","Data":"04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24"} Oct 07 18:14:43 crc kubenswrapper[4681]: I1007 18:14:43.102000 4681 scope.go:117] "RemoveContainer" containerID="aa28d70bece5e9925b181cd804253b1ca6cd72ebb0c8f2a332a9a3e419d4032f"
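
The CrashLoopBackOff error above shows kubelet's restart backoff at its ceiling: the delay between restart attempts roughly doubles from an initial 10s and is capped at five minutes, which is the "back-off 5m0s" in the message. A sketch of that schedule; restartDelay is illustrative and the kubelet's internal bookkeeping differs in detail:

package main

import (
	"fmt"
	"time"
)

// restartDelay returns the wait before the next restart attempt after the
// given number of consecutive failed restarts: 10s, 20s, 40s, ... capped at 5m.
func restartDelay(restarts int) time.Duration {
	d := 10 * time.Second // assumed initial backoff
	for i := 0; i < restarts; i++ {
		d *= 2
		if d >= 5*time.Minute {
			return 5 * time.Minute // cap, cf. "back-off 5m0s"
		}
	}
	return d
}

func main() {
	for r := 0; r <= 6; r++ {
		fmt.Printf("restart %d -> wait %v\n", r, restartDelay(r))
	}
	// restart 5 -> wait 5m0s: after a handful of crashes the pod sits in
	// CrashLoopBackOff for the full five minutes between attempts, which is
	// why the same error keeps reappearing in this log every probe cycle.
}
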
containerID="aa28d70bece5e9925b181cd804253b1ca6cd72ebb0c8f2a332a9a3e419d4032f" Oct 07 18:14:43 crc kubenswrapper[4681]: I1007 18:14:43.102868 4681 scope.go:117] "RemoveContainer" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" Oct 07 18:14:43 crc kubenswrapper[4681]: E1007 18:14:43.103368 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:14:44 crc kubenswrapper[4681]: I1007 18:14:44.405120 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pvvxr"] Oct 07 18:14:44 crc kubenswrapper[4681]: E1007 18:14:44.405518 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2743c88-7c95-463b-b5d3-4d183dd1e3e1" containerName="extract-content" Oct 07 18:14:44 crc kubenswrapper[4681]: I1007 18:14:44.405531 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2743c88-7c95-463b-b5d3-4d183dd1e3e1" containerName="extract-content" Oct 07 18:14:44 crc kubenswrapper[4681]: E1007 18:14:44.405555 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d07c5e64-700f-4b97-9030-4279f167c2f3" containerName="extract-content" Oct 07 18:14:44 crc kubenswrapper[4681]: I1007 18:14:44.405560 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="d07c5e64-700f-4b97-9030-4279f167c2f3" containerName="extract-content" Oct 07 18:14:44 crc kubenswrapper[4681]: E1007 18:14:44.405566 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff46847c-55dc-432a-9149-4301d615579c" containerName="registry-server" Oct 07 18:14:44 crc kubenswrapper[4681]: I1007 18:14:44.405573 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff46847c-55dc-432a-9149-4301d615579c" containerName="registry-server" Oct 07 18:14:44 crc kubenswrapper[4681]: E1007 18:14:44.405581 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff46847c-55dc-432a-9149-4301d615579c" containerName="extract-content" Oct 07 18:14:44 crc kubenswrapper[4681]: I1007 18:14:44.405587 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff46847c-55dc-432a-9149-4301d615579c" containerName="extract-content" Oct 07 18:14:44 crc kubenswrapper[4681]: E1007 18:14:44.405618 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff46847c-55dc-432a-9149-4301d615579c" containerName="extract-utilities" Oct 07 18:14:44 crc kubenswrapper[4681]: I1007 18:14:44.405624 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff46847c-55dc-432a-9149-4301d615579c" containerName="extract-utilities" Oct 07 18:14:44 crc kubenswrapper[4681]: E1007 18:14:44.405639 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d07c5e64-700f-4b97-9030-4279f167c2f3" containerName="extract-utilities" Oct 07 18:14:44 crc kubenswrapper[4681]: I1007 18:14:44.405646 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="d07c5e64-700f-4b97-9030-4279f167c2f3" containerName="extract-utilities" Oct 07 18:14:44 crc kubenswrapper[4681]: E1007 18:14:44.405655 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2743c88-7c95-463b-b5d3-4d183dd1e3e1" containerName="extract-utilities" Oct 07 18:14:44 crc 
kubenswrapper[4681]: I1007 18:14:44.405660 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2743c88-7c95-463b-b5d3-4d183dd1e3e1" containerName="extract-utilities" Oct 07 18:14:44 crc kubenswrapper[4681]: E1007 18:14:44.405674 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2743c88-7c95-463b-b5d3-4d183dd1e3e1" containerName="registry-server" Oct 07 18:14:44 crc kubenswrapper[4681]: I1007 18:14:44.405680 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2743c88-7c95-463b-b5d3-4d183dd1e3e1" containerName="registry-server" Oct 07 18:14:44 crc kubenswrapper[4681]: E1007 18:14:44.405704 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d07c5e64-700f-4b97-9030-4279f167c2f3" containerName="registry-server" Oct 07 18:14:44 crc kubenswrapper[4681]: I1007 18:14:44.405710 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="d07c5e64-700f-4b97-9030-4279f167c2f3" containerName="registry-server" Oct 07 18:14:44 crc kubenswrapper[4681]: I1007 18:14:44.405893 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="d07c5e64-700f-4b97-9030-4279f167c2f3" containerName="registry-server" Oct 07 18:14:44 crc kubenswrapper[4681]: I1007 18:14:44.405926 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff46847c-55dc-432a-9149-4301d615579c" containerName="registry-server" Oct 07 18:14:44 crc kubenswrapper[4681]: I1007 18:14:44.405944 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2743c88-7c95-463b-b5d3-4d183dd1e3e1" containerName="registry-server" Oct 07 18:14:44 crc kubenswrapper[4681]: I1007 18:14:44.407438 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvvxr" Oct 07 18:14:44 crc kubenswrapper[4681]: I1007 18:14:44.420585 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvvxr"] Oct 07 18:14:44 crc kubenswrapper[4681]: I1007 18:14:44.553797 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rc64\" (UniqueName: \"kubernetes.io/projected/7e89efaf-5d17-485a-9bed-85badd735558-kube-api-access-4rc64\") pod \"redhat-marketplace-pvvxr\" (UID: \"7e89efaf-5d17-485a-9bed-85badd735558\") " pod="openshift-marketplace/redhat-marketplace-pvvxr" Oct 07 18:14:44 crc kubenswrapper[4681]: I1007 18:14:44.553871 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e89efaf-5d17-485a-9bed-85badd735558-utilities\") pod \"redhat-marketplace-pvvxr\" (UID: \"7e89efaf-5d17-485a-9bed-85badd735558\") " pod="openshift-marketplace/redhat-marketplace-pvvxr" Oct 07 18:14:44 crc kubenswrapper[4681]: I1007 18:14:44.554003 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e89efaf-5d17-485a-9bed-85badd735558-catalog-content\") pod \"redhat-marketplace-pvvxr\" (UID: \"7e89efaf-5d17-485a-9bed-85badd735558\") " pod="openshift-marketplace/redhat-marketplace-pvvxr" Oct 07 18:14:44 crc kubenswrapper[4681]: I1007 18:14:44.655748 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rc64\" (UniqueName: \"kubernetes.io/projected/7e89efaf-5d17-485a-9bed-85badd735558-kube-api-access-4rc64\") pod \"redhat-marketplace-pvvxr\" (UID: \"7e89efaf-5d17-485a-9bed-85badd735558\") " 
pod="openshift-marketplace/redhat-marketplace-pvvxr" Oct 07 18:14:44 crc kubenswrapper[4681]: I1007 18:14:44.655895 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e89efaf-5d17-485a-9bed-85badd735558-utilities\") pod \"redhat-marketplace-pvvxr\" (UID: \"7e89efaf-5d17-485a-9bed-85badd735558\") " pod="openshift-marketplace/redhat-marketplace-pvvxr" Oct 07 18:14:44 crc kubenswrapper[4681]: I1007 18:14:44.655923 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e89efaf-5d17-485a-9bed-85badd735558-catalog-content\") pod \"redhat-marketplace-pvvxr\" (UID: \"7e89efaf-5d17-485a-9bed-85badd735558\") " pod="openshift-marketplace/redhat-marketplace-pvvxr" Oct 07 18:14:44 crc kubenswrapper[4681]: I1007 18:14:44.656475 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e89efaf-5d17-485a-9bed-85badd735558-utilities\") pod \"redhat-marketplace-pvvxr\" (UID: \"7e89efaf-5d17-485a-9bed-85badd735558\") " pod="openshift-marketplace/redhat-marketplace-pvvxr" Oct 07 18:14:44 crc kubenswrapper[4681]: I1007 18:14:44.656661 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e89efaf-5d17-485a-9bed-85badd735558-catalog-content\") pod \"redhat-marketplace-pvvxr\" (UID: \"7e89efaf-5d17-485a-9bed-85badd735558\") " pod="openshift-marketplace/redhat-marketplace-pvvxr" Oct 07 18:14:44 crc kubenswrapper[4681]: I1007 18:14:44.676529 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rc64\" (UniqueName: \"kubernetes.io/projected/7e89efaf-5d17-485a-9bed-85badd735558-kube-api-access-4rc64\") pod \"redhat-marketplace-pvvxr\" (UID: \"7e89efaf-5d17-485a-9bed-85badd735558\") " pod="openshift-marketplace/redhat-marketplace-pvvxr" Oct 07 18:14:44 crc kubenswrapper[4681]: I1007 18:14:44.727126 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvvxr" Oct 07 18:14:45 crc kubenswrapper[4681]: I1007 18:14:45.190130 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvvxr"] Oct 07 18:14:45 crc kubenswrapper[4681]: E1007 18:14:45.546081 4681 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e89efaf_5d17_485a_9bed_85badd735558.slice/crio-conmon-6f166db6eeafa2d76ff9bf6808472e631c5bde282bdd43398f35e9453849284c.scope\": RecentStats: unable to find data in memory cache]" Oct 07 18:14:46 crc kubenswrapper[4681]: I1007 18:14:46.164320 4681 generic.go:334] "Generic (PLEG): container finished" podID="7e89efaf-5d17-485a-9bed-85badd735558" containerID="6f166db6eeafa2d76ff9bf6808472e631c5bde282bdd43398f35e9453849284c" exitCode=0 Oct 07 18:14:46 crc kubenswrapper[4681]: I1007 18:14:46.164361 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvvxr" event={"ID":"7e89efaf-5d17-485a-9bed-85badd735558","Type":"ContainerDied","Data":"6f166db6eeafa2d76ff9bf6808472e631c5bde282bdd43398f35e9453849284c"} Oct 07 18:14:46 crc kubenswrapper[4681]: I1007 18:14:46.164635 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvvxr" event={"ID":"7e89efaf-5d17-485a-9bed-85badd735558","Type":"ContainerStarted","Data":"997cf723ac5f36e708c5c37b38b1c9c5c0a282c4ece8ffee75d48fcdf981ceac"} Oct 07 18:14:48 crc kubenswrapper[4681]: I1007 18:14:48.191632 4681 generic.go:334] "Generic (PLEG): container finished" podID="7e89efaf-5d17-485a-9bed-85badd735558" containerID="62e58833dec0f8cd2fde1f5f18c8fc4580719504b729095f2bd6407fb53aa3ed" exitCode=0 Oct 07 18:14:48 crc kubenswrapper[4681]: I1007 18:14:48.192183 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvvxr" event={"ID":"7e89efaf-5d17-485a-9bed-85badd735558","Type":"ContainerDied","Data":"62e58833dec0f8cd2fde1f5f18c8fc4580719504b729095f2bd6407fb53aa3ed"} Oct 07 18:14:50 crc kubenswrapper[4681]: I1007 18:14:50.214068 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvvxr" event={"ID":"7e89efaf-5d17-485a-9bed-85badd735558","Type":"ContainerStarted","Data":"179f4af5b26bf715bfa699a36eb7d745d135c5436749e83592731d5c1186ba30"} Oct 07 18:14:50 crc kubenswrapper[4681]: I1007 18:14:50.239917 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pvvxr" podStartSLOduration=3.61194376 podStartE2EDuration="6.23989919s" podCreationTimestamp="2025-10-07 18:14:44 +0000 UTC" firstStartedPulling="2025-10-07 18:14:46.166391392 +0000 UTC m=+4289.813802947" lastFinishedPulling="2025-10-07 18:14:48.794346822 +0000 UTC m=+4292.441758377" observedRunningTime="2025-10-07 18:14:50.232699899 +0000 UTC m=+4293.880111454" watchObservedRunningTime="2025-10-07 18:14:50.23989919 +0000 UTC m=+4293.887310745" Oct 07 18:14:54 crc kubenswrapper[4681]: I1007 18:14:54.029598 4681 scope.go:117] "RemoveContainer" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" Oct 07 18:14:54 crc kubenswrapper[4681]: E1007 18:14:54.030076 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea"
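
The pod_startup_latency_tracker entry above reports two figures for redhat-marketplace-pvvxr, and they are consistent with podStartSLOduration being the end-to-end duration minus the image-pull window: 6.23989919s - (18:14:48.794346822 - 18:14:46.166391392) = 3.61194376s. A worked check in Go, with the timestamps copied from that entry:

package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2025-10-07 18:14:44 +0000 UTC")            // podCreationTimestamp
	firstPull := parse("2025-10-07 18:14:46.166391392 +0000 UTC") // firstStartedPulling
	lastPull := parse("2025-10-07 18:14:48.794346822 +0000 UTC")  // lastFinishedPulling
	running := parse("2025-10-07 18:14:50.23989919 +0000 UTC")    // watchObservedRunningTime

	e2e := running.Sub(created)          // podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // end-to-end minus time spent pulling images

	fmt.Println(e2e, slo) // 6.23989919s 3.61194376s, matching the log
}
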
Oct 07 18:14:54 crc kubenswrapper[4681]: I1007 18:14:54.728929 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pvvxr" Oct 07 18:14:54 crc kubenswrapper[4681]: I1007 18:14:54.729249 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pvvxr" Oct 07 18:14:54 crc kubenswrapper[4681]: I1007 18:14:54.776202 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pvvxr" Oct 07 18:14:55 crc kubenswrapper[4681]: I1007 18:14:55.306673 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pvvxr" Oct 07 18:14:55 crc kubenswrapper[4681]: I1007 18:14:55.363709 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvvxr"] Oct 07 18:14:57 crc kubenswrapper[4681]: I1007 18:14:57.272124 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pvvxr" podUID="7e89efaf-5d17-485a-9bed-85badd735558" containerName="registry-server" containerID="cri-o://179f4af5b26bf715bfa699a36eb7d745d135c5436749e83592731d5c1186ba30" gracePeriod=2 Oct 07 18:14:57 crc kubenswrapper[4681]: I1007 18:14:57.767227 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvvxr" Oct 07 18:14:57 crc kubenswrapper[4681]: I1007 18:14:57.918479 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rc64\" (UniqueName: \"kubernetes.io/projected/7e89efaf-5d17-485a-9bed-85badd735558-kube-api-access-4rc64\") pod \"7e89efaf-5d17-485a-9bed-85badd735558\" (UID: \"7e89efaf-5d17-485a-9bed-85badd735558\") " Oct 07 18:14:57 crc kubenswrapper[4681]: I1007 18:14:57.918542 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e89efaf-5d17-485a-9bed-85badd735558-catalog-content\") pod \"7e89efaf-5d17-485a-9bed-85badd735558\" (UID: \"7e89efaf-5d17-485a-9bed-85badd735558\") " Oct 07 18:14:57 crc kubenswrapper[4681]: I1007 18:14:57.918623 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e89efaf-5d17-485a-9bed-85badd735558-utilities\") pod \"7e89efaf-5d17-485a-9bed-85badd735558\" (UID: \"7e89efaf-5d17-485a-9bed-85badd735558\") " Oct 07 18:14:57 crc kubenswrapper[4681]: I1007 18:14:57.920036 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e89efaf-5d17-485a-9bed-85badd735558-utilities" (OuterVolumeSpecName: "utilities") pod "7e89efaf-5d17-485a-9bed-85badd735558" (UID: "7e89efaf-5d17-485a-9bed-85badd735558"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 18:14:57 crc kubenswrapper[4681]: I1007 18:14:57.933427 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e89efaf-5d17-485a-9bed-85badd735558-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e89efaf-5d17-485a-9bed-85badd735558" (UID: "7e89efaf-5d17-485a-9bed-85badd735558"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 18:14:57 crc kubenswrapper[4681]: I1007 18:14:57.943939 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e89efaf-5d17-485a-9bed-85badd735558-kube-api-access-4rc64" (OuterVolumeSpecName: "kube-api-access-4rc64") pod "7e89efaf-5d17-485a-9bed-85badd735558" (UID: "7e89efaf-5d17-485a-9bed-85badd735558"). InnerVolumeSpecName "kube-api-access-4rc64". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 18:14:58 crc kubenswrapper[4681]: I1007 18:14:58.021001 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rc64\" (UniqueName: \"kubernetes.io/projected/7e89efaf-5d17-485a-9bed-85badd735558-kube-api-access-4rc64\") on node \"crc\" DevicePath \"\"" Oct 07 18:14:58 crc kubenswrapper[4681]: I1007 18:14:58.021038 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e89efaf-5d17-485a-9bed-85badd735558-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 18:14:58 crc kubenswrapper[4681]: I1007 18:14:58.021051 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e89efaf-5d17-485a-9bed-85badd735558-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 18:14:58 crc kubenswrapper[4681]: I1007 18:14:58.282073 4681 generic.go:334] "Generic (PLEG): container finished" podID="7e89efaf-5d17-485a-9bed-85badd735558" containerID="179f4af5b26bf715bfa699a36eb7d745d135c5436749e83592731d5c1186ba30" exitCode=0 Oct 07 18:14:58 crc kubenswrapper[4681]: I1007 18:14:58.282116 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvvxr" event={"ID":"7e89efaf-5d17-485a-9bed-85badd735558","Type":"ContainerDied","Data":"179f4af5b26bf715bfa699a36eb7d745d135c5436749e83592731d5c1186ba30"} Oct 07 18:14:58 crc kubenswrapper[4681]: I1007 18:14:58.282141 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvvxr" event={"ID":"7e89efaf-5d17-485a-9bed-85badd735558","Type":"ContainerDied","Data":"997cf723ac5f36e708c5c37b38b1c9c5c0a282c4ece8ffee75d48fcdf981ceac"} Oct 07 18:14:58 crc kubenswrapper[4681]: I1007 18:14:58.282157 4681 scope.go:117] "RemoveContainer" containerID="179f4af5b26bf715bfa699a36eb7d745d135c5436749e83592731d5c1186ba30" Oct 07 18:14:58 crc kubenswrapper[4681]: I1007 18:14:58.282268 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvvxr" Oct 07 18:14:58 crc kubenswrapper[4681]: I1007 18:14:58.310937 4681 scope.go:117] "RemoveContainer" containerID="62e58833dec0f8cd2fde1f5f18c8fc4580719504b729095f2bd6407fb53aa3ed" Oct 07 18:14:58 crc kubenswrapper[4681]: I1007 18:14:58.316528 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvvxr"] Oct 07 18:14:58 crc kubenswrapper[4681]: I1007 18:14:58.332438 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvvxr"] Oct 07 18:14:58 crc kubenswrapper[4681]: I1007 18:14:58.341548 4681 scope.go:117] "RemoveContainer" containerID="6f166db6eeafa2d76ff9bf6808472e631c5bde282bdd43398f35e9453849284c" Oct 07 18:14:58 crc kubenswrapper[4681]: I1007 18:14:58.377613 4681 scope.go:117] "RemoveContainer" containerID="179f4af5b26bf715bfa699a36eb7d745d135c5436749e83592731d5c1186ba30" Oct 07 18:14:58 crc kubenswrapper[4681]: E1007 18:14:58.378168 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"179f4af5b26bf715bfa699a36eb7d745d135c5436749e83592731d5c1186ba30\": container with ID starting with 179f4af5b26bf715bfa699a36eb7d745d135c5436749e83592731d5c1186ba30 not found: ID does not exist" containerID="179f4af5b26bf715bfa699a36eb7d745d135c5436749e83592731d5c1186ba30" Oct 07 18:14:58 crc kubenswrapper[4681]: I1007 18:14:58.378212 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"179f4af5b26bf715bfa699a36eb7d745d135c5436749e83592731d5c1186ba30"} err="failed to get container status \"179f4af5b26bf715bfa699a36eb7d745d135c5436749e83592731d5c1186ba30\": rpc error: code = NotFound desc = could not find container \"179f4af5b26bf715bfa699a36eb7d745d135c5436749e83592731d5c1186ba30\": container with ID starting with 179f4af5b26bf715bfa699a36eb7d745d135c5436749e83592731d5c1186ba30 not found: ID does not exist" Oct 07 18:14:58 crc kubenswrapper[4681]: I1007 18:14:58.378239 4681 scope.go:117] "RemoveContainer" containerID="62e58833dec0f8cd2fde1f5f18c8fc4580719504b729095f2bd6407fb53aa3ed" Oct 07 18:14:58 crc kubenswrapper[4681]: E1007 18:14:58.378674 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62e58833dec0f8cd2fde1f5f18c8fc4580719504b729095f2bd6407fb53aa3ed\": container with ID starting with 62e58833dec0f8cd2fde1f5f18c8fc4580719504b729095f2bd6407fb53aa3ed not found: ID does not exist" containerID="62e58833dec0f8cd2fde1f5f18c8fc4580719504b729095f2bd6407fb53aa3ed" Oct 07 18:14:58 crc kubenswrapper[4681]: I1007 18:14:58.378704 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62e58833dec0f8cd2fde1f5f18c8fc4580719504b729095f2bd6407fb53aa3ed"} err="failed to get container status \"62e58833dec0f8cd2fde1f5f18c8fc4580719504b729095f2bd6407fb53aa3ed\": rpc error: code = NotFound desc = could not find container \"62e58833dec0f8cd2fde1f5f18c8fc4580719504b729095f2bd6407fb53aa3ed\": container with ID starting with 62e58833dec0f8cd2fde1f5f18c8fc4580719504b729095f2bd6407fb53aa3ed not found: ID does not exist" Oct 07 18:14:58 crc kubenswrapper[4681]: I1007 18:14:58.378725 4681 scope.go:117] "RemoveContainer" containerID="6f166db6eeafa2d76ff9bf6808472e631c5bde282bdd43398f35e9453849284c" Oct 07 18:14:58 crc kubenswrapper[4681]: E1007 18:14:58.379023 4681 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6f166db6eeafa2d76ff9bf6808472e631c5bde282bdd43398f35e9453849284c\": container with ID starting with 6f166db6eeafa2d76ff9bf6808472e631c5bde282bdd43398f35e9453849284c not found: ID does not exist" containerID="6f166db6eeafa2d76ff9bf6808472e631c5bde282bdd43398f35e9453849284c" Oct 07 18:14:58 crc kubenswrapper[4681]: I1007 18:14:58.379055 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f166db6eeafa2d76ff9bf6808472e631c5bde282bdd43398f35e9453849284c"} err="failed to get container status \"6f166db6eeafa2d76ff9bf6808472e631c5bde282bdd43398f35e9453849284c\": rpc error: code = NotFound desc = could not find container \"6f166db6eeafa2d76ff9bf6808472e631c5bde282bdd43398f35e9453849284c\": container with ID starting with 6f166db6eeafa2d76ff9bf6808472e631c5bde282bdd43398f35e9453849284c not found: ID does not exist" Oct 07 18:14:59 crc kubenswrapper[4681]: I1007 18:14:59.042244 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e89efaf-5d17-485a-9bed-85badd735558" path="/var/lib/kubelet/pods/7e89efaf-5d17-485a-9bed-85badd735558/volumes" Oct 07 18:15:00 crc kubenswrapper[4681]: I1007 18:15:00.137792 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331015-49dcf"] Oct 07 18:15:00 crc kubenswrapper[4681]: E1007 18:15:00.138425 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e89efaf-5d17-485a-9bed-85badd735558" containerName="extract-utilities" Oct 07 18:15:00 crc kubenswrapper[4681]: I1007 18:15:00.138439 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e89efaf-5d17-485a-9bed-85badd735558" containerName="extract-utilities" Oct 07 18:15:00 crc kubenswrapper[4681]: E1007 18:15:00.138458 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e89efaf-5d17-485a-9bed-85badd735558" containerName="extract-content" Oct 07 18:15:00 crc kubenswrapper[4681]: I1007 18:15:00.138464 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e89efaf-5d17-485a-9bed-85badd735558" containerName="extract-content" Oct 07 18:15:00 crc kubenswrapper[4681]: E1007 18:15:00.138482 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e89efaf-5d17-485a-9bed-85badd735558" containerName="registry-server" Oct 07 18:15:00 crc kubenswrapper[4681]: I1007 18:15:00.138489 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e89efaf-5d17-485a-9bed-85badd735558" containerName="registry-server" Oct 07 18:15:00 crc kubenswrapper[4681]: I1007 18:15:00.138663 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e89efaf-5d17-485a-9bed-85badd735558" containerName="registry-server" Oct 07 18:15:00 crc kubenswrapper[4681]: I1007 18:15:00.139301 4681 util.go:30] "No sandbox for pod can be found. 
Oct 07 18:15:00 crc kubenswrapper[4681]: I1007 18:15:00.141230 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 18:15:00 crc kubenswrapper[4681]: I1007 18:15:00.146945 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 18:15:00 crc kubenswrapper[4681]: I1007 18:15:00.152525 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331015-49dcf"] Oct 07 18:15:00 crc kubenswrapper[4681]: I1007 18:15:00.263129 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1-config-volume\") pod \"collect-profiles-29331015-49dcf\" (UID: \"eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331015-49dcf" Oct 07 18:15:00 crc kubenswrapper[4681]: I1007 18:15:00.263286 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkvz4\" (UniqueName: \"kubernetes.io/projected/eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1-kube-api-access-wkvz4\") pod \"collect-profiles-29331015-49dcf\" (UID: \"eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331015-49dcf" Oct 07 18:15:00 crc kubenswrapper[4681]: I1007 18:15:00.263314 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1-secret-volume\") pod \"collect-profiles-29331015-49dcf\" (UID: \"eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331015-49dcf" Oct 07 18:15:00 crc kubenswrapper[4681]: I1007 18:15:00.365099 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1-config-volume\") pod \"collect-profiles-29331015-49dcf\" (UID: \"eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331015-49dcf" Oct 07 18:15:00 crc kubenswrapper[4681]: I1007 18:15:00.365221 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkvz4\" (UniqueName: \"kubernetes.io/projected/eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1-kube-api-access-wkvz4\") pod \"collect-profiles-29331015-49dcf\" (UID: \"eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331015-49dcf" Oct 07 18:15:00 crc kubenswrapper[4681]: I1007 18:15:00.365244 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1-secret-volume\") pod \"collect-profiles-29331015-49dcf\" (UID: \"eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331015-49dcf" Oct 07 18:15:00 crc kubenswrapper[4681]: I1007 18:15:00.367139 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1-config-volume\") pod
\"collect-profiles-29331015-49dcf\" (UID: \"eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331015-49dcf" Oct 07 18:15:00 crc kubenswrapper[4681]: I1007 18:15:00.493017 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1-secret-volume\") pod \"collect-profiles-29331015-49dcf\" (UID: \"eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331015-49dcf" Oct 07 18:15:00 crc kubenswrapper[4681]: I1007 18:15:00.497843 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkvz4\" (UniqueName: \"kubernetes.io/projected/eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1-kube-api-access-wkvz4\") pod \"collect-profiles-29331015-49dcf\" (UID: \"eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331015-49dcf" Oct 07 18:15:00 crc kubenswrapper[4681]: I1007 18:15:00.783534 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331015-49dcf" Oct 07 18:15:01 crc kubenswrapper[4681]: I1007 18:15:01.311249 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331015-49dcf"] Oct 07 18:15:02 crc kubenswrapper[4681]: I1007 18:15:02.318715 4681 generic.go:334] "Generic (PLEG): container finished" podID="eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1" containerID="b1cda1e6e4640b371a726f4d44c588880bd0f2749e5fa9de06e0954cd9c9caee" exitCode=0 Oct 07 18:15:02 crc kubenswrapper[4681]: I1007 18:15:02.319122 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331015-49dcf" event={"ID":"eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1","Type":"ContainerDied","Data":"b1cda1e6e4640b371a726f4d44c588880bd0f2749e5fa9de06e0954cd9c9caee"} Oct 07 18:15:02 crc kubenswrapper[4681]: I1007 18:15:02.319188 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331015-49dcf" event={"ID":"eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1","Type":"ContainerStarted","Data":"57e1490716b024d0c2cbea6409f554f4385ccddab2f5cf6336b881f9b72daef0"} Oct 07 18:15:03 crc kubenswrapper[4681]: I1007 18:15:03.676162 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331015-49dcf" Oct 07 18:15:03 crc kubenswrapper[4681]: I1007 18:15:03.842638 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkvz4\" (UniqueName: \"kubernetes.io/projected/eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1-kube-api-access-wkvz4\") pod \"eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1\" (UID: \"eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1\") " Oct 07 18:15:03 crc kubenswrapper[4681]: I1007 18:15:03.842953 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1-config-volume\") pod \"eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1\" (UID: \"eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1\") " Oct 07 18:15:03 crc kubenswrapper[4681]: I1007 18:15:03.843196 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1-secret-volume\") pod \"eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1\" (UID: \"eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1\") " Oct 07 18:15:03 crc kubenswrapper[4681]: I1007 18:15:03.843738 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1-config-volume" (OuterVolumeSpecName: "config-volume") pod "eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1" (UID: "eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 18:15:03 crc kubenswrapper[4681]: I1007 18:15:03.844181 4681 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 18:15:03 crc kubenswrapper[4681]: I1007 18:15:03.852718 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1" (UID: "eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 18:15:03 crc kubenswrapper[4681]: I1007 18:15:03.863597 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1-kube-api-access-wkvz4" (OuterVolumeSpecName: "kube-api-access-wkvz4") pod "eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1" (UID: "eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1"). InnerVolumeSpecName "kube-api-access-wkvz4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 18:15:03 crc kubenswrapper[4681]: I1007 18:15:03.947743 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkvz4\" (UniqueName: \"kubernetes.io/projected/eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1-kube-api-access-wkvz4\") on node \"crc\" DevicePath \"\"" Oct 07 18:15:03 crc kubenswrapper[4681]: I1007 18:15:03.947810 4681 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 18:15:04 crc kubenswrapper[4681]: I1007 18:15:04.340601 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331015-49dcf" event={"ID":"eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1","Type":"ContainerDied","Data":"57e1490716b024d0c2cbea6409f554f4385ccddab2f5cf6336b881f9b72daef0"} Oct 07 18:15:04 crc kubenswrapper[4681]: I1007 18:15:04.340653 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57e1490716b024d0c2cbea6409f554f4385ccddab2f5cf6336b881f9b72daef0" Oct 07 18:15:04 crc kubenswrapper[4681]: I1007 18:15:04.340929 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331015-49dcf" Oct 07 18:15:04 crc kubenswrapper[4681]: I1007 18:15:04.756503 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330970-dk76w"] Oct 07 18:15:04 crc kubenswrapper[4681]: I1007 18:15:04.768327 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330970-dk76w"] Oct 07 18:15:05 crc kubenswrapper[4681]: I1007 18:15:05.041102 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b3af6da-74ce-4d0c-a479-593e951996b2" path="/var/lib/kubelet/pods/1b3af6da-74ce-4d0c-a479-593e951996b2/volumes" Oct 07 18:15:07 crc kubenswrapper[4681]: I1007 18:15:07.035062 4681 scope.go:117] "RemoveContainer" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" Oct 07 18:15:07 crc kubenswrapper[4681]: E1007 18:15:07.035540 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:15:19 crc kubenswrapper[4681]: I1007 18:15:19.029138 4681 scope.go:117] "RemoveContainer" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" Oct 07 18:15:19 crc kubenswrapper[4681]: E1007 18:15:19.029896 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:15:34 crc kubenswrapper[4681]: I1007 18:15:34.029515 4681 scope.go:117] "RemoveContainer" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" Oct 07 
18:15:34 crc kubenswrapper[4681]: E1007 18:15:34.030189 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:15:37 crc kubenswrapper[4681]: I1007 18:15:37.156074 4681 scope.go:117] "RemoveContainer" containerID="16d9e238168403139820b21ab6d377cfaf8e6a5ee55dcc80f83f6f4c01c7a1d3" Oct 07 18:15:45 crc kubenswrapper[4681]: I1007 18:15:45.029871 4681 scope.go:117] "RemoveContainer" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" Oct 07 18:15:45 crc kubenswrapper[4681]: E1007 18:15:45.031453 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:15:56 crc kubenswrapper[4681]: I1007 18:15:56.029534 4681 scope.go:117] "RemoveContainer" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" Oct 07 18:15:56 crc kubenswrapper[4681]: E1007 18:15:56.030243 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:16:10 crc kubenswrapper[4681]: I1007 18:16:10.029734 4681 scope.go:117] "RemoveContainer" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" Oct 07 18:16:10 crc kubenswrapper[4681]: E1007 18:16:10.030429 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:16:24 crc kubenswrapper[4681]: I1007 18:16:24.029808 4681 scope.go:117] "RemoveContainer" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" Oct 07 18:16:24 crc kubenswrapper[4681]: E1007 18:16:24.030642 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:16:39 crc kubenswrapper[4681]: I1007 18:16:39.029676 4681 scope.go:117] "RemoveContainer" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" Oct 07 18:16:39 crc 
kubenswrapper[4681]: E1007 18:16:39.030424 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:16:52 crc kubenswrapper[4681]: I1007 18:16:52.029521 4681 scope.go:117] "RemoveContainer" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" Oct 07 18:16:52 crc kubenswrapper[4681]: E1007 18:16:52.030191 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:17:03 crc kubenswrapper[4681]: I1007 18:17:03.029211 4681 scope.go:117] "RemoveContainer" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" Oct 07 18:17:03 crc kubenswrapper[4681]: E1007 18:17:03.029987 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:17:16 crc kubenswrapper[4681]: I1007 18:17:16.029713 4681 scope.go:117] "RemoveContainer" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" Oct 07 18:17:16 crc kubenswrapper[4681]: E1007 18:17:16.031619 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:17:30 crc kubenswrapper[4681]: I1007 18:17:30.032116 4681 scope.go:117] "RemoveContainer" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" Oct 07 18:17:30 crc kubenswrapper[4681]: E1007 18:17:30.033452 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:17:44 crc kubenswrapper[4681]: I1007 18:17:44.029019 4681 scope.go:117] "RemoveContainer" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" Oct 07 18:17:44 crc kubenswrapper[4681]: E1007 18:17:44.029769 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:17:56 crc kubenswrapper[4681]: I1007 18:17:56.029122 4681 scope.go:117] "RemoveContainer" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" Oct 07 18:17:56 crc kubenswrapper[4681]: E1007 18:17:56.030115 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:18:07 crc kubenswrapper[4681]: I1007 18:18:07.035219 4681 scope.go:117] "RemoveContainer" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" Oct 07 18:18:07 crc kubenswrapper[4681]: E1007 18:18:07.036127 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:18:15 crc kubenswrapper[4681]: I1007 18:18:15.864412 4681 generic.go:334] "Generic (PLEG): container finished" podID="01a2ae55-90f7-432a-bc03-aedd6db91210" containerID="b371669858cd1272bb0ae822d5f02eae0321bac68b71c14ede2396c4b3df99c0" exitCode=0 Oct 07 18:18:15 crc kubenswrapper[4681]: I1007 18:18:15.864574 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"01a2ae55-90f7-432a-bc03-aedd6db91210","Type":"ContainerDied","Data":"b371669858cd1272bb0ae822d5f02eae0321bac68b71c14ede2396c4b3df99c0"} Oct 07 18:18:17 crc kubenswrapper[4681]: E1007 18:18:17.031771 4681 info.go:109] Failed to get network devices: open /sys/class/net/d4939819d786119/address: no such file or directory Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.226639 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.396027 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01a2ae55-90f7-432a-bc03-aedd6db91210-ssh-key\") pod \"01a2ae55-90f7-432a-bc03-aedd6db91210\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.396095 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/01a2ae55-90f7-432a-bc03-aedd6db91210-openstack-config\") pod \"01a2ae55-90f7-432a-bc03-aedd6db91210\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.396211 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01a2ae55-90f7-432a-bc03-aedd6db91210-config-data\") pod \"01a2ae55-90f7-432a-bc03-aedd6db91210\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.396236 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/01a2ae55-90f7-432a-bc03-aedd6db91210-ca-certs\") pod \"01a2ae55-90f7-432a-bc03-aedd6db91210\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.396265 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/01a2ae55-90f7-432a-bc03-aedd6db91210-test-operator-ephemeral-workdir\") pod \"01a2ae55-90f7-432a-bc03-aedd6db91210\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.396309 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/01a2ae55-90f7-432a-bc03-aedd6db91210-test-operator-ephemeral-temporary\") pod \"01a2ae55-90f7-432a-bc03-aedd6db91210\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.396362 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj9z8\" (UniqueName: \"kubernetes.io/projected/01a2ae55-90f7-432a-bc03-aedd6db91210-kube-api-access-kj9z8\") pod \"01a2ae55-90f7-432a-bc03-aedd6db91210\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.396434 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/01a2ae55-90f7-432a-bc03-aedd6db91210-openstack-config-secret\") pod \"01a2ae55-90f7-432a-bc03-aedd6db91210\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.396497 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"01a2ae55-90f7-432a-bc03-aedd6db91210\" (UID: \"01a2ae55-90f7-432a-bc03-aedd6db91210\") " Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.397450 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01a2ae55-90f7-432a-bc03-aedd6db91210-test-operator-ephemeral-temporary" (OuterVolumeSpecName: 
"test-operator-ephemeral-temporary") pod "01a2ae55-90f7-432a-bc03-aedd6db91210" (UID: "01a2ae55-90f7-432a-bc03-aedd6db91210"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.397539 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01a2ae55-90f7-432a-bc03-aedd6db91210-config-data" (OuterVolumeSpecName: "config-data") pod "01a2ae55-90f7-432a-bc03-aedd6db91210" (UID: "01a2ae55-90f7-432a-bc03-aedd6db91210"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.402081 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "test-operator-logs") pod "01a2ae55-90f7-432a-bc03-aedd6db91210" (UID: "01a2ae55-90f7-432a-bc03-aedd6db91210"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.402319 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01a2ae55-90f7-432a-bc03-aedd6db91210-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "01a2ae55-90f7-432a-bc03-aedd6db91210" (UID: "01a2ae55-90f7-432a-bc03-aedd6db91210"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.402639 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01a2ae55-90f7-432a-bc03-aedd6db91210-kube-api-access-kj9z8" (OuterVolumeSpecName: "kube-api-access-kj9z8") pod "01a2ae55-90f7-432a-bc03-aedd6db91210" (UID: "01a2ae55-90f7-432a-bc03-aedd6db91210"). InnerVolumeSpecName "kube-api-access-kj9z8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.423912 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01a2ae55-90f7-432a-bc03-aedd6db91210-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "01a2ae55-90f7-432a-bc03-aedd6db91210" (UID: "01a2ae55-90f7-432a-bc03-aedd6db91210"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.427748 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01a2ae55-90f7-432a-bc03-aedd6db91210-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "01a2ae55-90f7-432a-bc03-aedd6db91210" (UID: "01a2ae55-90f7-432a-bc03-aedd6db91210"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.428184 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01a2ae55-90f7-432a-bc03-aedd6db91210-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "01a2ae55-90f7-432a-bc03-aedd6db91210" (UID: "01a2ae55-90f7-432a-bc03-aedd6db91210"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.446833 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01a2ae55-90f7-432a-bc03-aedd6db91210-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "01a2ae55-90f7-432a-bc03-aedd6db91210" (UID: "01a2ae55-90f7-432a-bc03-aedd6db91210"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.499143 4681 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01a2ae55-90f7-432a-bc03-aedd6db91210-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.499534 4681 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/01a2ae55-90f7-432a-bc03-aedd6db91210-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.499637 4681 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/01a2ae55-90f7-432a-bc03-aedd6db91210-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.499724 4681 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/01a2ae55-90f7-432a-bc03-aedd6db91210-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.499806 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj9z8\" (UniqueName: \"kubernetes.io/projected/01a2ae55-90f7-432a-bc03-aedd6db91210-kube-api-access-kj9z8\") on node \"crc\" DevicePath \"\"" Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.499947 4681 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/01a2ae55-90f7-432a-bc03-aedd6db91210-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.501810 4681 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.501923 4681 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01a2ae55-90f7-432a-bc03-aedd6db91210-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.502005 4681 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/01a2ae55-90f7-432a-bc03-aedd6db91210-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.523949 4681 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.603537 4681 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.884129 4681 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/tempest-tests-tempest" event={"ID":"01a2ae55-90f7-432a-bc03-aedd6db91210","Type":"ContainerDied","Data":"d4939819d786119146de3faa1cb9b44e957b0eb222e8e9b537137abbc2e5c787"} Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.884441 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4939819d786119146de3faa1cb9b44e957b0eb222e8e9b537137abbc2e5c787" Oct 07 18:18:17 crc kubenswrapper[4681]: I1007 18:18:17.884252 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 07 18:18:20 crc kubenswrapper[4681]: I1007 18:18:20.030028 4681 scope.go:117] "RemoveContainer" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" Oct 07 18:18:20 crc kubenswrapper[4681]: E1007 18:18:20.030569 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:18:23 crc kubenswrapper[4681]: I1007 18:18:23.287455 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 07 18:18:23 crc kubenswrapper[4681]: E1007 18:18:23.288210 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1" containerName="collect-profiles" Oct 07 18:18:23 crc kubenswrapper[4681]: I1007 18:18:23.288227 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1" containerName="collect-profiles" Oct 07 18:18:23 crc kubenswrapper[4681]: E1007 18:18:23.288272 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01a2ae55-90f7-432a-bc03-aedd6db91210" containerName="tempest-tests-tempest-tests-runner" Oct 07 18:18:23 crc kubenswrapper[4681]: I1007 18:18:23.288280 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a2ae55-90f7-432a-bc03-aedd6db91210" containerName="tempest-tests-tempest-tests-runner" Oct 07 18:18:23 crc kubenswrapper[4681]: I1007 18:18:23.288477 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="01a2ae55-90f7-432a-bc03-aedd6db91210" containerName="tempest-tests-tempest-tests-runner" Oct 07 18:18:23 crc kubenswrapper[4681]: I1007 18:18:23.288499 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaa55690-2f63-4e3f-a8c2-1bd59a6a16c1" containerName="collect-profiles" Oct 07 18:18:23 crc kubenswrapper[4681]: I1007 18:18:23.289168 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 18:18:23 crc kubenswrapper[4681]: I1007 18:18:23.291234 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-pcscr" Oct 07 18:18:23 crc kubenswrapper[4681]: I1007 18:18:23.299654 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 07 18:18:23 crc kubenswrapper[4681]: I1007 18:18:23.410083 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"27c574e9-b637-4326-853e-f298321f1a1b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 18:18:23 crc kubenswrapper[4681]: I1007 18:18:23.410406 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw4r6\" (UniqueName: \"kubernetes.io/projected/27c574e9-b637-4326-853e-f298321f1a1b-kube-api-access-jw4r6\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"27c574e9-b637-4326-853e-f298321f1a1b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 18:18:23 crc kubenswrapper[4681]: I1007 18:18:23.511753 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"27c574e9-b637-4326-853e-f298321f1a1b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 18:18:23 crc kubenswrapper[4681]: I1007 18:18:23.512134 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw4r6\" (UniqueName: \"kubernetes.io/projected/27c574e9-b637-4326-853e-f298321f1a1b-kube-api-access-jw4r6\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"27c574e9-b637-4326-853e-f298321f1a1b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 18:18:23 crc kubenswrapper[4681]: I1007 18:18:23.514571 4681 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"27c574e9-b637-4326-853e-f298321f1a1b\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 18:18:23 crc kubenswrapper[4681]: I1007 18:18:23.533101 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw4r6\" (UniqueName: \"kubernetes.io/projected/27c574e9-b637-4326-853e-f298321f1a1b-kube-api-access-jw4r6\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"27c574e9-b637-4326-853e-f298321f1a1b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 18:18:23 crc kubenswrapper[4681]: I1007 18:18:23.563510 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"27c574e9-b637-4326-853e-f298321f1a1b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 18:18:23 crc 
kubenswrapper[4681]: I1007 18:18:23.619591 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 18:18:24 crc kubenswrapper[4681]: I1007 18:18:24.042501 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 07 18:18:24 crc kubenswrapper[4681]: I1007 18:18:24.053107 4681 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 18:18:24 crc kubenswrapper[4681]: I1007 18:18:24.945743 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"27c574e9-b637-4326-853e-f298321f1a1b","Type":"ContainerStarted","Data":"4e04a5ae6e484e91b3d796515e19f718b90dce4eadcde1788750a86143a72094"} Oct 07 18:18:25 crc kubenswrapper[4681]: I1007 18:18:25.957227 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"27c574e9-b637-4326-853e-f298321f1a1b","Type":"ContainerStarted","Data":"00bc54e5b6b655cb305d172529ea16d503017fd49287935dc557f75d07613390"} Oct 07 18:18:25 crc kubenswrapper[4681]: I1007 18:18:25.978778 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.0609947809999998 podStartE2EDuration="2.978759163s" podCreationTimestamp="2025-10-07 18:18:23 +0000 UTC" firstStartedPulling="2025-10-07 18:18:24.052825885 +0000 UTC m=+4507.700237450" lastFinishedPulling="2025-10-07 18:18:24.970590277 +0000 UTC m=+4508.618001832" observedRunningTime="2025-10-07 18:18:25.969908317 +0000 UTC m=+4509.617319872" watchObservedRunningTime="2025-10-07 18:18:25.978759163 +0000 UTC m=+4509.626170718" Oct 07 18:18:26 crc kubenswrapper[4681]: I1007 18:18:26.737599 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-58b7954b47-8j9j9" podUID="642b1a07-3c90-40b5-b6cb-af1d8832649b" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Oct 07 18:18:34 crc kubenswrapper[4681]: I1007 18:18:34.029242 4681 scope.go:117] "RemoveContainer" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" Oct 07 18:18:34 crc kubenswrapper[4681]: E1007 18:18:34.030029 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:18:42 crc kubenswrapper[4681]: I1007 18:18:42.488762 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8xmpv/must-gather-p95jd"] Oct 07 18:18:42 crc kubenswrapper[4681]: I1007 18:18:42.490835 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8xmpv/must-gather-p95jd" Oct 07 18:18:42 crc kubenswrapper[4681]: I1007 18:18:42.505625 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8xmpv"/"openshift-service-ca.crt" Oct 07 18:18:42 crc kubenswrapper[4681]: I1007 18:18:42.505916 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8xmpv"/"kube-root-ca.crt" Oct 07 18:18:42 crc kubenswrapper[4681]: I1007 18:18:42.588275 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8xmpv/must-gather-p95jd"] Oct 07 18:18:42 crc kubenswrapper[4681]: I1007 18:18:42.683971 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/196867f6-ac14-4388-abf3-d184f19deffb-must-gather-output\") pod \"must-gather-p95jd\" (UID: \"196867f6-ac14-4388-abf3-d184f19deffb\") " pod="openshift-must-gather-8xmpv/must-gather-p95jd" Oct 07 18:18:42 crc kubenswrapper[4681]: I1007 18:18:42.684054 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bdxw\" (UniqueName: \"kubernetes.io/projected/196867f6-ac14-4388-abf3-d184f19deffb-kube-api-access-9bdxw\") pod \"must-gather-p95jd\" (UID: \"196867f6-ac14-4388-abf3-d184f19deffb\") " pod="openshift-must-gather-8xmpv/must-gather-p95jd" Oct 07 18:18:42 crc kubenswrapper[4681]: I1007 18:18:42.786070 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bdxw\" (UniqueName: \"kubernetes.io/projected/196867f6-ac14-4388-abf3-d184f19deffb-kube-api-access-9bdxw\") pod \"must-gather-p95jd\" (UID: \"196867f6-ac14-4388-abf3-d184f19deffb\") " pod="openshift-must-gather-8xmpv/must-gather-p95jd" Oct 07 18:18:42 crc kubenswrapper[4681]: I1007 18:18:42.786264 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/196867f6-ac14-4388-abf3-d184f19deffb-must-gather-output\") pod \"must-gather-p95jd\" (UID: \"196867f6-ac14-4388-abf3-d184f19deffb\") " pod="openshift-must-gather-8xmpv/must-gather-p95jd" Oct 07 18:18:42 crc kubenswrapper[4681]: I1007 18:18:42.786756 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/196867f6-ac14-4388-abf3-d184f19deffb-must-gather-output\") pod \"must-gather-p95jd\" (UID: \"196867f6-ac14-4388-abf3-d184f19deffb\") " pod="openshift-must-gather-8xmpv/must-gather-p95jd" Oct 07 18:18:42 crc kubenswrapper[4681]: I1007 18:18:42.823862 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bdxw\" (UniqueName: \"kubernetes.io/projected/196867f6-ac14-4388-abf3-d184f19deffb-kube-api-access-9bdxw\") pod \"must-gather-p95jd\" (UID: \"196867f6-ac14-4388-abf3-d184f19deffb\") " pod="openshift-must-gather-8xmpv/must-gather-p95jd" Oct 07 18:18:43 crc kubenswrapper[4681]: I1007 18:18:43.114457 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8xmpv/must-gather-p95jd" Oct 07 18:18:43 crc kubenswrapper[4681]: I1007 18:18:43.632079 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8xmpv/must-gather-p95jd"] Oct 07 18:18:43 crc kubenswrapper[4681]: W1007 18:18:43.632121 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod196867f6_ac14_4388_abf3_d184f19deffb.slice/crio-10e4e3a547c6e792965e875a23988c1f8de7b63dd23c70b2fcba698dfa4f6270 WatchSource:0}: Error finding container 10e4e3a547c6e792965e875a23988c1f8de7b63dd23c70b2fcba698dfa4f6270: Status 404 returned error can't find the container with id 10e4e3a547c6e792965e875a23988c1f8de7b63dd23c70b2fcba698dfa4f6270 Oct 07 18:18:44 crc kubenswrapper[4681]: I1007 18:18:44.112548 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8xmpv/must-gather-p95jd" event={"ID":"196867f6-ac14-4388-abf3-d184f19deffb","Type":"ContainerStarted","Data":"10e4e3a547c6e792965e875a23988c1f8de7b63dd23c70b2fcba698dfa4f6270"} Oct 07 18:18:49 crc kubenswrapper[4681]: I1007 18:18:49.030079 4681 scope.go:117] "RemoveContainer" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" Oct 07 18:18:49 crc kubenswrapper[4681]: E1007 18:18:49.030739 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:18:49 crc kubenswrapper[4681]: I1007 18:18:49.155740 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8xmpv/must-gather-p95jd" event={"ID":"196867f6-ac14-4388-abf3-d184f19deffb","Type":"ContainerStarted","Data":"fd399fca63690249c1665c4eae5625a1e4b549af77103b63557052bc2d6163bc"} Oct 07 18:18:49 crc kubenswrapper[4681]: I1007 18:18:49.156282 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8xmpv/must-gather-p95jd" event={"ID":"196867f6-ac14-4388-abf3-d184f19deffb","Type":"ContainerStarted","Data":"dea2bd95c4c4b5414bb4dd0788d593c86871fda10f49dad79a8f76c73c30ff08"} Oct 07 18:18:49 crc kubenswrapper[4681]: I1007 18:18:49.171561 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8xmpv/must-gather-p95jd" podStartSLOduration=2.333571134 podStartE2EDuration="7.171535686s" podCreationTimestamp="2025-10-07 18:18:42 +0000 UTC" firstStartedPulling="2025-10-07 18:18:43.63468297 +0000 UTC m=+4527.282094535" lastFinishedPulling="2025-10-07 18:18:48.472647532 +0000 UTC m=+4532.120059087" observedRunningTime="2025-10-07 18:18:49.169446367 +0000 UTC m=+4532.816857922" watchObservedRunningTime="2025-10-07 18:18:49.171535686 +0000 UTC m=+4532.818947241" Oct 07 18:18:54 crc kubenswrapper[4681]: I1007 18:18:54.066793 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8xmpv/crc-debug-8qs2b"] Oct 07 18:18:54 crc kubenswrapper[4681]: I1007 18:18:54.068515 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8xmpv/crc-debug-8qs2b" Oct 07 18:18:54 crc kubenswrapper[4681]: I1007 18:18:54.071511 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8xmpv"/"default-dockercfg-rhmg5" Oct 07 18:18:54 crc kubenswrapper[4681]: I1007 18:18:54.214869 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkvwh\" (UniqueName: \"kubernetes.io/projected/cda1c49b-59ff-4fb8-a89b-32540ea78511-kube-api-access-nkvwh\") pod \"crc-debug-8qs2b\" (UID: \"cda1c49b-59ff-4fb8-a89b-32540ea78511\") " pod="openshift-must-gather-8xmpv/crc-debug-8qs2b" Oct 07 18:18:54 crc kubenswrapper[4681]: I1007 18:18:54.215309 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cda1c49b-59ff-4fb8-a89b-32540ea78511-host\") pod \"crc-debug-8qs2b\" (UID: \"cda1c49b-59ff-4fb8-a89b-32540ea78511\") " pod="openshift-must-gather-8xmpv/crc-debug-8qs2b" Oct 07 18:18:54 crc kubenswrapper[4681]: I1007 18:18:54.316778 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cda1c49b-59ff-4fb8-a89b-32540ea78511-host\") pod \"crc-debug-8qs2b\" (UID: \"cda1c49b-59ff-4fb8-a89b-32540ea78511\") " pod="openshift-must-gather-8xmpv/crc-debug-8qs2b" Oct 07 18:18:54 crc kubenswrapper[4681]: I1007 18:18:54.316949 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkvwh\" (UniqueName: \"kubernetes.io/projected/cda1c49b-59ff-4fb8-a89b-32540ea78511-kube-api-access-nkvwh\") pod \"crc-debug-8qs2b\" (UID: \"cda1c49b-59ff-4fb8-a89b-32540ea78511\") " pod="openshift-must-gather-8xmpv/crc-debug-8qs2b" Oct 07 18:18:54 crc kubenswrapper[4681]: I1007 18:18:54.317497 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cda1c49b-59ff-4fb8-a89b-32540ea78511-host\") pod \"crc-debug-8qs2b\" (UID: \"cda1c49b-59ff-4fb8-a89b-32540ea78511\") " pod="openshift-must-gather-8xmpv/crc-debug-8qs2b" Oct 07 18:18:54 crc kubenswrapper[4681]: I1007 18:18:54.341870 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkvwh\" (UniqueName: \"kubernetes.io/projected/cda1c49b-59ff-4fb8-a89b-32540ea78511-kube-api-access-nkvwh\") pod \"crc-debug-8qs2b\" (UID: \"cda1c49b-59ff-4fb8-a89b-32540ea78511\") " pod="openshift-must-gather-8xmpv/crc-debug-8qs2b" Oct 07 18:18:54 crc kubenswrapper[4681]: I1007 18:18:54.384175 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8xmpv/crc-debug-8qs2b" Oct 07 18:18:55 crc kubenswrapper[4681]: I1007 18:18:55.207320 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8xmpv/crc-debug-8qs2b" event={"ID":"cda1c49b-59ff-4fb8-a89b-32540ea78511","Type":"ContainerStarted","Data":"17aae25dde606e2698b1e429372e65bab0c057d8810d47eb5cfea7ab1b229357"} Oct 07 18:19:00 crc kubenswrapper[4681]: I1007 18:19:00.029163 4681 scope.go:117] "RemoveContainer" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" Oct 07 18:19:00 crc kubenswrapper[4681]: E1007 18:19:00.029943 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:19:05 crc kubenswrapper[4681]: I1007 18:19:05.319855 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8xmpv/crc-debug-8qs2b" event={"ID":"cda1c49b-59ff-4fb8-a89b-32540ea78511","Type":"ContainerStarted","Data":"1a4ca6d2a3b2f33743a84ca90249125df2fbc0cf6ba317c78cd031d4dff88cef"} Oct 07 18:19:05 crc kubenswrapper[4681]: I1007 18:19:05.339300 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8xmpv/crc-debug-8qs2b" podStartSLOduration=0.687062186 podStartE2EDuration="11.33928254s" podCreationTimestamp="2025-10-07 18:18:54 +0000 UTC" firstStartedPulling="2025-10-07 18:18:54.417489987 +0000 UTC m=+4538.064901542" lastFinishedPulling="2025-10-07 18:19:05.069710341 +0000 UTC m=+4548.717121896" observedRunningTime="2025-10-07 18:19:05.339245599 +0000 UTC m=+4548.986657154" watchObservedRunningTime="2025-10-07 18:19:05.33928254 +0000 UTC m=+4548.986694095" Oct 07 18:19:15 crc kubenswrapper[4681]: I1007 18:19:15.029748 4681 scope.go:117] "RemoveContainer" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" Oct 07 18:19:15 crc kubenswrapper[4681]: E1007 18:19:15.030618 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:19:28 crc kubenswrapper[4681]: I1007 18:19:28.029738 4681 scope.go:117] "RemoveContainer" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" Oct 07 18:19:28 crc kubenswrapper[4681]: E1007 18:19:28.031188 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:19:42 crc kubenswrapper[4681]: I1007 18:19:42.029509 4681 scope.go:117] "RemoveContainer" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" Oct 07 
18:19:42 crc kubenswrapper[4681]: E1007 18:19:42.030410 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea"
Oct 07 18:19:57 crc kubenswrapper[4681]: I1007 18:19:57.036669 4681 scope.go:117] "RemoveContainer" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24"
Oct 07 18:19:57 crc kubenswrapper[4681]: I1007 18:19:57.806098 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerStarted","Data":"643d746b61b3eb3b97412f52fbef0c2236e197f641a33ea27810ca402e822c27"}
Oct 07 18:20:08 crc kubenswrapper[4681]: I1007 18:20:08.582707 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8nngl"]
Oct 07 18:20:08 crc kubenswrapper[4681]: I1007 18:20:08.585225 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8nngl"
Oct 07 18:20:08 crc kubenswrapper[4681]: I1007 18:20:08.613825 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8nngl"]
Oct 07 18:20:08 crc kubenswrapper[4681]: I1007 18:20:08.780786 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99r82\" (UniqueName: \"kubernetes.io/projected/f5470119-eb69-4077-8130-2272adbd8a60-kube-api-access-99r82\") pod \"certified-operators-8nngl\" (UID: \"f5470119-eb69-4077-8130-2272adbd8a60\") " pod="openshift-marketplace/certified-operators-8nngl"
Oct 07 18:20:08 crc kubenswrapper[4681]: I1007 18:20:08.781058 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5470119-eb69-4077-8130-2272adbd8a60-catalog-content\") pod \"certified-operators-8nngl\" (UID: \"f5470119-eb69-4077-8130-2272adbd8a60\") " pod="openshift-marketplace/certified-operators-8nngl"
Oct 07 18:20:08 crc kubenswrapper[4681]: I1007 18:20:08.781251 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5470119-eb69-4077-8130-2272adbd8a60-utilities\") pod \"certified-operators-8nngl\" (UID: \"f5470119-eb69-4077-8130-2272adbd8a60\") " pod="openshift-marketplace/certified-operators-8nngl"
Oct 07 18:20:08 crc kubenswrapper[4681]: I1007 18:20:08.882442 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5470119-eb69-4077-8130-2272adbd8a60-catalog-content\") pod \"certified-operators-8nngl\" (UID: \"f5470119-eb69-4077-8130-2272adbd8a60\") " pod="openshift-marketplace/certified-operators-8nngl"
Oct 07 18:20:08 crc kubenswrapper[4681]: I1007 18:20:08.882709 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5470119-eb69-4077-8130-2272adbd8a60-utilities\") pod \"certified-operators-8nngl\" (UID: \"f5470119-eb69-4077-8130-2272adbd8a60\") " pod="openshift-marketplace/certified-operators-8nngl"
Oct 07 18:20:08 crc kubenswrapper[4681]: I1007 18:20:08.882815 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99r82\" (UniqueName: \"kubernetes.io/projected/f5470119-eb69-4077-8130-2272adbd8a60-kube-api-access-99r82\") pod \"certified-operators-8nngl\" (UID: \"f5470119-eb69-4077-8130-2272adbd8a60\") " pod="openshift-marketplace/certified-operators-8nngl"
Oct 07 18:20:08 crc kubenswrapper[4681]: I1007 18:20:08.883511 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5470119-eb69-4077-8130-2272adbd8a60-utilities\") pod \"certified-operators-8nngl\" (UID: \"f5470119-eb69-4077-8130-2272adbd8a60\") " pod="openshift-marketplace/certified-operators-8nngl"
Oct 07 18:20:08 crc kubenswrapper[4681]: I1007 18:20:08.885757 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5470119-eb69-4077-8130-2272adbd8a60-catalog-content\") pod \"certified-operators-8nngl\" (UID: \"f5470119-eb69-4077-8130-2272adbd8a60\") " pod="openshift-marketplace/certified-operators-8nngl"
Oct 07 18:20:08 crc kubenswrapper[4681]: I1007 18:20:08.904518 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99r82\" (UniqueName: \"kubernetes.io/projected/f5470119-eb69-4077-8130-2272adbd8a60-kube-api-access-99r82\") pod \"certified-operators-8nngl\" (UID: \"f5470119-eb69-4077-8130-2272adbd8a60\") " pod="openshift-marketplace/certified-operators-8nngl"
Oct 07 18:20:08 crc kubenswrapper[4681]: I1007 18:20:08.908598 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8nngl"
Oct 07 18:20:09 crc kubenswrapper[4681]: I1007 18:20:09.505501 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8nngl"]
Oct 07 18:20:09 crc kubenswrapper[4681]: I1007 18:20:09.936084 4681 generic.go:334] "Generic (PLEG): container finished" podID="f5470119-eb69-4077-8130-2272adbd8a60" containerID="bd07f4b9ef1bdb3230ff13467c08c5da45cdef80ed9f972b8c1bdaf6a0685f73" exitCode=0
Oct 07 18:20:09 crc kubenswrapper[4681]: I1007 18:20:09.936194 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nngl" event={"ID":"f5470119-eb69-4077-8130-2272adbd8a60","Type":"ContainerDied","Data":"bd07f4b9ef1bdb3230ff13467c08c5da45cdef80ed9f972b8c1bdaf6a0685f73"}
Oct 07 18:20:09 crc kubenswrapper[4681]: I1007 18:20:09.936393 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nngl" event={"ID":"f5470119-eb69-4077-8130-2272adbd8a60","Type":"ContainerStarted","Data":"6cd53f3247cd7cfb77b0b457c2f64de660eadeed57cd7232f0bc4a10845aff03"}
Oct 07 18:20:11 crc kubenswrapper[4681]: I1007 18:20:11.962189 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nngl" event={"ID":"f5470119-eb69-4077-8130-2272adbd8a60","Type":"ContainerStarted","Data":"5a8eab46b7fdd85cb26377e8a7780455a70b7bac485bc46104a905e71d130a39"}
Oct 07 18:20:12 crc kubenswrapper[4681]: I1007 18:20:12.980330 4681 generic.go:334] "Generic (PLEG): container finished" podID="f5470119-eb69-4077-8130-2272adbd8a60" containerID="5a8eab46b7fdd85cb26377e8a7780455a70b7bac485bc46104a905e71d130a39" exitCode=0
Oct 07 18:20:12 crc kubenswrapper[4681]: I1007 18:20:12.981583 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nngl" event={"ID":"f5470119-eb69-4077-8130-2272adbd8a60","Type":"ContainerDied","Data":"5a8eab46b7fdd85cb26377e8a7780455a70b7bac485bc46104a905e71d130a39"}
Oct 07 18:20:13 crc kubenswrapper[4681]: I1007 18:20:13.994946 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nngl" event={"ID":"f5470119-eb69-4077-8130-2272adbd8a60","Type":"ContainerStarted","Data":"294dc84f9b51a223fcdac7e32e0b097922e1e76b02719d98800226380866e162"}
Oct 07 18:20:14 crc kubenswrapper[4681]: I1007 18:20:14.020345 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8nngl" podStartSLOduration=2.511873267 podStartE2EDuration="6.020329642s" podCreationTimestamp="2025-10-07 18:20:08 +0000 UTC" firstStartedPulling="2025-10-07 18:20:09.949332547 +0000 UTC m=+4613.596744102" lastFinishedPulling="2025-10-07 18:20:13.457788922 +0000 UTC m=+4617.105200477" observedRunningTime="2025-10-07 18:20:14.018425789 +0000 UTC m=+4617.665837344" watchObservedRunningTime="2025-10-07 18:20:14.020329642 +0000 UTC m=+4617.667741197"
Oct 07 18:20:18 crc kubenswrapper[4681]: I1007 18:20:18.911093 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8nngl"
Oct 07 18:20:18 crc kubenswrapper[4681]: I1007 18:20:18.912751 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8nngl"
Oct 07 18:20:18 crc kubenswrapper[4681]: I1007 18:20:18.964354 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8nngl"
Oct 07 18:20:19 crc kubenswrapper[4681]: I1007 18:20:19.083567 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8nngl"
Oct 07 18:20:19 crc kubenswrapper[4681]: I1007 18:20:19.207734 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8nngl"]
Oct 07 18:20:21 crc kubenswrapper[4681]: I1007 18:20:21.048871 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8nngl" podUID="f5470119-eb69-4077-8130-2272adbd8a60" containerName="registry-server" containerID="cri-o://294dc84f9b51a223fcdac7e32e0b097922e1e76b02719d98800226380866e162" gracePeriod=2
Oct 07 18:20:21 crc kubenswrapper[4681]: I1007 18:20:21.582227 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8nngl"
Oct 07 18:20:21 crc kubenswrapper[4681]: I1007 18:20:21.663967 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5470119-eb69-4077-8130-2272adbd8a60-catalog-content\") pod \"f5470119-eb69-4077-8130-2272adbd8a60\" (UID: \"f5470119-eb69-4077-8130-2272adbd8a60\") "
Oct 07 18:20:21 crc kubenswrapper[4681]: I1007 18:20:21.664116 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99r82\" (UniqueName: \"kubernetes.io/projected/f5470119-eb69-4077-8130-2272adbd8a60-kube-api-access-99r82\") pod \"f5470119-eb69-4077-8130-2272adbd8a60\" (UID: \"f5470119-eb69-4077-8130-2272adbd8a60\") "
Oct 07 18:20:21 crc kubenswrapper[4681]: I1007 18:20:21.664210 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5470119-eb69-4077-8130-2272adbd8a60-utilities\") pod \"f5470119-eb69-4077-8130-2272adbd8a60\" (UID: \"f5470119-eb69-4077-8130-2272adbd8a60\") "
Oct 07 18:20:21 crc kubenswrapper[4681]: I1007 18:20:21.664785 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5470119-eb69-4077-8130-2272adbd8a60-utilities" (OuterVolumeSpecName: "utilities") pod "f5470119-eb69-4077-8130-2272adbd8a60" (UID: "f5470119-eb69-4077-8130-2272adbd8a60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 18:20:21 crc kubenswrapper[4681]: I1007 18:20:21.683032 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5470119-eb69-4077-8130-2272adbd8a60-kube-api-access-99r82" (OuterVolumeSpecName: "kube-api-access-99r82") pod "f5470119-eb69-4077-8130-2272adbd8a60" (UID: "f5470119-eb69-4077-8130-2272adbd8a60"). InnerVolumeSpecName "kube-api-access-99r82". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 18:20:21 crc kubenswrapper[4681]: I1007 18:20:21.717436 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5470119-eb69-4077-8130-2272adbd8a60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f5470119-eb69-4077-8130-2272adbd8a60" (UID: "f5470119-eb69-4077-8130-2272adbd8a60"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 18:20:21 crc kubenswrapper[4681]: I1007 18:20:21.766681 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99r82\" (UniqueName: \"kubernetes.io/projected/f5470119-eb69-4077-8130-2272adbd8a60-kube-api-access-99r82\") on node \"crc\" DevicePath \"\""
Oct 07 18:20:21 crc kubenswrapper[4681]: I1007 18:20:21.766715 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5470119-eb69-4077-8130-2272adbd8a60-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 18:20:21 crc kubenswrapper[4681]: I1007 18:20:21.766724 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5470119-eb69-4077-8130-2272adbd8a60-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 18:20:22 crc kubenswrapper[4681]: I1007 18:20:22.059500 4681 generic.go:334] "Generic (PLEG): container finished" podID="f5470119-eb69-4077-8130-2272adbd8a60" containerID="294dc84f9b51a223fcdac7e32e0b097922e1e76b02719d98800226380866e162" exitCode=0
Oct 07 18:20:22 crc kubenswrapper[4681]: I1007 18:20:22.059541 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nngl" event={"ID":"f5470119-eb69-4077-8130-2272adbd8a60","Type":"ContainerDied","Data":"294dc84f9b51a223fcdac7e32e0b097922e1e76b02719d98800226380866e162"}
Oct 07 18:20:22 crc kubenswrapper[4681]: I1007 18:20:22.059566 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nngl" event={"ID":"f5470119-eb69-4077-8130-2272adbd8a60","Type":"ContainerDied","Data":"6cd53f3247cd7cfb77b0b457c2f64de660eadeed57cd7232f0bc4a10845aff03"}
Oct 07 18:20:22 crc kubenswrapper[4681]: I1007 18:20:22.059580 4681 scope.go:117] "RemoveContainer" containerID="294dc84f9b51a223fcdac7e32e0b097922e1e76b02719d98800226380866e162"
Oct 07 18:20:22 crc kubenswrapper[4681]: I1007 18:20:22.059779 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8nngl"
Oct 07 18:20:22 crc kubenswrapper[4681]: I1007 18:20:22.092128 4681 scope.go:117] "RemoveContainer" containerID="5a8eab46b7fdd85cb26377e8a7780455a70b7bac485bc46104a905e71d130a39"
Oct 07 18:20:22 crc kubenswrapper[4681]: I1007 18:20:22.100257 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8nngl"]
Oct 07 18:20:22 crc kubenswrapper[4681]: I1007 18:20:22.108429 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8nngl"]
Oct 07 18:20:22 crc kubenswrapper[4681]: I1007 18:20:22.147033 4681 scope.go:117] "RemoveContainer" containerID="bd07f4b9ef1bdb3230ff13467c08c5da45cdef80ed9f972b8c1bdaf6a0685f73"
Oct 07 18:20:22 crc kubenswrapper[4681]: I1007 18:20:22.170442 4681 scope.go:117] "RemoveContainer" containerID="294dc84f9b51a223fcdac7e32e0b097922e1e76b02719d98800226380866e162"
Oct 07 18:20:22 crc kubenswrapper[4681]: E1007 18:20:22.171104 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"294dc84f9b51a223fcdac7e32e0b097922e1e76b02719d98800226380866e162\": container with ID starting with 294dc84f9b51a223fcdac7e32e0b097922e1e76b02719d98800226380866e162 not found: ID does not exist" containerID="294dc84f9b51a223fcdac7e32e0b097922e1e76b02719d98800226380866e162"
Oct 07 18:20:22 crc kubenswrapper[4681]: I1007 18:20:22.171267 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"294dc84f9b51a223fcdac7e32e0b097922e1e76b02719d98800226380866e162"} err="failed to get container status \"294dc84f9b51a223fcdac7e32e0b097922e1e76b02719d98800226380866e162\": rpc error: code = NotFound desc = could not find container \"294dc84f9b51a223fcdac7e32e0b097922e1e76b02719d98800226380866e162\": container with ID starting with 294dc84f9b51a223fcdac7e32e0b097922e1e76b02719d98800226380866e162 not found: ID does not exist"
Oct 07 18:20:22 crc kubenswrapper[4681]: I1007 18:20:22.171409 4681 scope.go:117] "RemoveContainer" containerID="5a8eab46b7fdd85cb26377e8a7780455a70b7bac485bc46104a905e71d130a39"
Oct 07 18:20:22 crc kubenswrapper[4681]: E1007 18:20:22.171832 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a8eab46b7fdd85cb26377e8a7780455a70b7bac485bc46104a905e71d130a39\": container with ID starting with 5a8eab46b7fdd85cb26377e8a7780455a70b7bac485bc46104a905e71d130a39 not found: ID does not exist" containerID="5a8eab46b7fdd85cb26377e8a7780455a70b7bac485bc46104a905e71d130a39"
Oct 07 18:20:22 crc kubenswrapper[4681]: I1007 18:20:22.172012 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a8eab46b7fdd85cb26377e8a7780455a70b7bac485bc46104a905e71d130a39"} err="failed to get container status \"5a8eab46b7fdd85cb26377e8a7780455a70b7bac485bc46104a905e71d130a39\": rpc error: code = NotFound desc = could not find container \"5a8eab46b7fdd85cb26377e8a7780455a70b7bac485bc46104a905e71d130a39\": container with ID starting with 5a8eab46b7fdd85cb26377e8a7780455a70b7bac485bc46104a905e71d130a39 not found: ID does not exist"
Oct 07 18:20:22 crc kubenswrapper[4681]: I1007 18:20:22.172102 4681 scope.go:117] "RemoveContainer" containerID="bd07f4b9ef1bdb3230ff13467c08c5da45cdef80ed9f972b8c1bdaf6a0685f73"
Oct 07 18:20:22 crc kubenswrapper[4681]: E1007 18:20:22.172829 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd07f4b9ef1bdb3230ff13467c08c5da45cdef80ed9f972b8c1bdaf6a0685f73\": container with ID starting with bd07f4b9ef1bdb3230ff13467c08c5da45cdef80ed9f972b8c1bdaf6a0685f73 not found: ID does not exist" containerID="bd07f4b9ef1bdb3230ff13467c08c5da45cdef80ed9f972b8c1bdaf6a0685f73"
Oct 07 18:20:22 crc kubenswrapper[4681]: I1007 18:20:22.172864 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd07f4b9ef1bdb3230ff13467c08c5da45cdef80ed9f972b8c1bdaf6a0685f73"} err="failed to get container status \"bd07f4b9ef1bdb3230ff13467c08c5da45cdef80ed9f972b8c1bdaf6a0685f73\": rpc error: code = NotFound desc = could not find container \"bd07f4b9ef1bdb3230ff13467c08c5da45cdef80ed9f972b8c1bdaf6a0685f73\": container with ID starting with bd07f4b9ef1bdb3230ff13467c08c5da45cdef80ed9f972b8c1bdaf6a0685f73 not found: ID does not exist"
Oct 07 18:20:22 crc kubenswrapper[4681]: I1007 18:20:22.965481 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7d8b9fbb46-6wjkq_07f40489-1614-45c8-864b-2288473c7c1d/barbican-api-log/0.log"
Oct 07 18:20:22 crc kubenswrapper[4681]: I1007 18:20:22.980617 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7d8b9fbb46-6wjkq_07f40489-1614-45c8-864b-2288473c7c1d/barbican-api/0.log"
Oct 07 18:20:23 crc kubenswrapper[4681]: I1007 18:20:23.050910 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5470119-eb69-4077-8130-2272adbd8a60" path="/var/lib/kubelet/pods/f5470119-eb69-4077-8130-2272adbd8a60/volumes"
Oct 07 18:20:23 crc kubenswrapper[4681]: I1007 18:20:23.248758 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d869d8764-5bjtz_f35c1eb1-692d-4484-a686-5ad0ce63744b/barbican-keystone-listener-log/0.log"
Oct 07 18:20:23 crc kubenswrapper[4681]: I1007 18:20:23.303936 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d869d8764-5bjtz_f35c1eb1-692d-4484-a686-5ad0ce63744b/barbican-keystone-listener/0.log"
Oct 07 18:20:23 crc kubenswrapper[4681]: I1007 18:20:23.483495 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-57b57fb795-6426k_62d6d4e2-d1d4-4967-82e9-143266e1165b/barbican-worker/0.log"
Oct 07 18:20:23 crc kubenswrapper[4681]: I1007 18:20:23.538460 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-57b57fb795-6426k_62d6d4e2-d1d4-4967-82e9-143266e1165b/barbican-worker-log/0.log"
Oct 07 18:20:23 crc kubenswrapper[4681]: I1007 18:20:23.762167 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l_5da1ef34-103f-4687-8454-89abe7b61f54/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 18:20:23 crc kubenswrapper[4681]: I1007 18:20:23.993126 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c8863ad2-0fce-42cc-aae0-cd51fe7a79ab/ceilometer-central-agent/0.log"
Oct 07 18:20:24 crc kubenswrapper[4681]: I1007 18:20:24.119493 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c8863ad2-0fce-42cc-aae0-cd51fe7a79ab/ceilometer-notification-agent/0.log"
Oct 07 18:20:24 crc kubenswrapper[4681]: I1007 18:20:24.131786 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c8863ad2-0fce-42cc-aae0-cd51fe7a79ab/proxy-httpd/0.log"
Oct 07 18:20:24 crc kubenswrapper[4681]: I1007 18:20:24.245976 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c8863ad2-0fce-42cc-aae0-cd51fe7a79ab/sg-core/0.log"
Oct 07 18:20:24 crc kubenswrapper[4681]: I1007 18:20:24.430690 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7/cinder-api/0.log"
Oct 07 18:20:24 crc kubenswrapper[4681]: I1007 18:20:24.494748 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7/cinder-api-log/0.log"
Oct 07 18:20:24 crc kubenswrapper[4681]: I1007 18:20:24.690380 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9/cinder-scheduler/0.log"
Oct 07 18:20:24 crc kubenswrapper[4681]: I1007 18:20:24.795364 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9/probe/0.log"
Oct 07 18:20:25 crc kubenswrapper[4681]: I1007 18:20:25.026521 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp_44ed5213-33ec-47cd-bc96-8d536fa86f61/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 18:20:25 crc kubenswrapper[4681]: I1007 18:20:25.245178 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-d494h_fea47565-ef99-4b31-869a-075d2d8331e9/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 18:20:25 crc kubenswrapper[4681]: I1007 18:20:25.416325 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-jktwm_13708146-56fd-426d-988d-d6e66d01cadb/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 18:20:25 crc kubenswrapper[4681]: I1007 18:20:25.623039 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-67cb876dc9-zsjhj_a5b5bb10-eaaa-410b-8040-c9b15d4c0e62/init/0.log"
Oct 07 18:20:25 crc kubenswrapper[4681]: I1007 18:20:25.751521 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-67cb876dc9-zsjhj_a5b5bb10-eaaa-410b-8040-c9b15d4c0e62/init/0.log"
Oct 07 18:20:25 crc kubenswrapper[4681]: I1007 18:20:25.940349 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-67cb876dc9-zsjhj_a5b5bb10-eaaa-410b-8040-c9b15d4c0e62/dnsmasq-dns/0.log"
Oct 07 18:20:26 crc kubenswrapper[4681]: I1007 18:20:26.047113 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7_41bd87d5-77d6-4866-b9b8-aaed777393b5/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 18:20:26 crc kubenswrapper[4681]: I1007 18:20:26.176258 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_dbe731b8-1f1d-449c-accb-3cb97696d1ae/glance-httpd/0.log"
Oct 07 18:20:26 crc kubenswrapper[4681]: I1007 18:20:26.333449 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_dbe731b8-1f1d-449c-accb-3cb97696d1ae/glance-log/0.log"
Oct 07 18:20:26 crc kubenswrapper[4681]: I1007 18:20:26.415097 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_1469d2bd-93c0-414a-951e-175bc73f377e/glance-log/0.log"
Oct 07 18:20:26 crc kubenswrapper[4681]: I1007 18:20:26.428504 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_1469d2bd-93c0-414a-951e-175bc73f377e/glance-httpd/0.log"
Oct 07 18:20:26 crc kubenswrapper[4681]: I1007 18:20:26.730119 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-f945f854d-hm49c_02a91326-9285-4589-a05b-c0a2c2ed397e/horizon/2.log"
Oct 07 18:20:26 crc kubenswrapper[4681]: I1007 18:20:26.757814 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-f945f854d-hm49c_02a91326-9285-4589-a05b-c0a2c2ed397e/horizon/1.log"
Oct 07 18:20:27 crc kubenswrapper[4681]: I1007 18:20:27.175169 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-f945f854d-hm49c_02a91326-9285-4589-a05b-c0a2c2ed397e/horizon-log/0.log"
Oct 07 18:20:27 crc kubenswrapper[4681]: I1007 18:20:27.189267 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg_07a9584a-a546-4ec3-ba13-1f0db8c3ba39/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 18:20:27 crc kubenswrapper[4681]: I1007 18:20:27.297125 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-9qxmf_51176e78-6a59-4fe2-abc5-88a3177b9ee0/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 18:20:27 crc kubenswrapper[4681]: I1007 18:20:27.560697 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29331001-dj57c_0205aec3-2b1b-427b-9359-40d4118c7f59/keystone-cron/0.log"
Oct 07 18:20:27 crc kubenswrapper[4681]: I1007 18:20:27.637127 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_92e5095e-22e9-46b1-900a-492f827a05eb/kube-state-metrics/0.log"
Oct 07 18:20:27 crc kubenswrapper[4681]: I1007 18:20:27.904821 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7db8ffcf86-wnnfn_ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd/keystone-api/0.log"
Oct 07 18:20:28 crc kubenswrapper[4681]: I1007 18:20:28.072198 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-djlbc_3c08afe2-1291-4ac9-8eb5-493f9cff1c4d/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 18:20:28 crc kubenswrapper[4681]: I1007 18:20:28.813672 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5b94d78545-dfdgb_c77522e8-d403-4227-9740-21dca2843c58/neutron-httpd/0.log"
Oct 07 18:20:28 crc kubenswrapper[4681]: I1007 18:20:28.944334 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5b94d78545-dfdgb_c77522e8-d403-4227-9740-21dca2843c58/neutron-api/0.log"
Oct 07 18:20:29 crc kubenswrapper[4681]: I1007 18:20:29.070428 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw_d1f9c32e-011c-49a9-8319-4aeb852fa976/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 18:20:30 crc kubenswrapper[4681]: I1007 18:20:30.054908 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_ef096ee9-933c-44da-a4b7-6cc5b62ecc49/nova-cell0-conductor-conductor/0.log"
Oct 07 18:20:30 crc kubenswrapper[4681]: I1007 18:20:30.670049 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9241da9a-f1bd-4d93-bd72-f84e5dd85083/nova-api-log/0.log"
Oct 07 18:20:30 crc kubenswrapper[4681]: I1007 18:20:30.728623 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a1765198-be66-424a-b57a-187a6b62c4bc/nova-cell1-conductor-conductor/0.log"
Oct 07 18:20:30 crc kubenswrapper[4681]: I1007 18:20:30.939097 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9241da9a-f1bd-4d93-bd72-f84e5dd85083/nova-api-api/0.log"
Oct 07 18:20:31 crc kubenswrapper[4681]: I1007 18:20:31.055329 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_48284d8c-6f51-4fa0-ae29-b933b93a2411/nova-cell1-novncproxy-novncproxy/0.log"
Oct 07 18:20:31 crc kubenswrapper[4681]: I1007 18:20:31.791621 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-rk6x9_a7d237e9-d752-4244-8f32-be01a5ca3f6f/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 18:20:31 crc kubenswrapper[4681]: I1007 18:20:31.880210 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5e80aacf-4a39-48b9-96c3-692936cf2855/nova-metadata-log/0.log"
Oct 07 18:20:32 crc kubenswrapper[4681]: I1007 18:20:32.860235 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_51290795-4e81-4099-ab84-e9529128d78a/nova-scheduler-scheduler/0.log"
Oct 07 18:20:32 crc kubenswrapper[4681]: I1007 18:20:32.960147 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_61391679-2b8c-4be3-b3d7-bd2d3e667c15/mysql-bootstrap/0.log"
Oct 07 18:20:33 crc kubenswrapper[4681]: I1007 18:20:33.266221 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_61391679-2b8c-4be3-b3d7-bd2d3e667c15/mysql-bootstrap/0.log"
Oct 07 18:20:33 crc kubenswrapper[4681]: I1007 18:20:33.281789 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_61391679-2b8c-4be3-b3d7-bd2d3e667c15/galera/0.log"
Oct 07 18:20:33 crc kubenswrapper[4681]: I1007 18:20:33.601667 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7d261af7-bc67-4638-8b4c-1f7a7cb129a2/mysql-bootstrap/0.log"
Oct 07 18:20:33 crc kubenswrapper[4681]: I1007 18:20:33.773714 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7d261af7-bc67-4638-8b4c-1f7a7cb129a2/mysql-bootstrap/0.log"
Oct 07 18:20:33 crc kubenswrapper[4681]: I1007 18:20:33.866318 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7d261af7-bc67-4638-8b4c-1f7a7cb129a2/galera/0.log"
Oct 07 18:20:34 crc kubenswrapper[4681]: I1007 18:20:34.230232 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5e80aacf-4a39-48b9-96c3-692936cf2855/nova-metadata-metadata/0.log"
Oct 07 18:20:34 crc kubenswrapper[4681]: I1007 18:20:34.625611 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a253ef31-4d02-4fbd-8842-cf2fbe41f307/openstackclient/0.log"
Oct 07 18:20:34 crc kubenswrapper[4681]: I1007 18:20:34.771027 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-v4f4x_361da154-8a78-497d-9bb1-78335f5a286d/openstack-network-exporter/0.log"
Oct 07 18:20:35 crc kubenswrapper[4681]: I1007 18:20:35.025799 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6tf88_6a172508-6850-4bf5-8e7f-6c6674c4a1ee/ovsdb-server-init/0.log"
Oct 07 18:20:35 crc kubenswrapper[4681]: I1007 18:20:35.214472 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6tf88_6a172508-6850-4bf5-8e7f-6c6674c4a1ee/ovsdb-server-init/0.log"
Oct 07 18:20:35 crc kubenswrapper[4681]: I1007 18:20:35.252856 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6tf88_6a172508-6850-4bf5-8e7f-6c6674c4a1ee/ovs-vswitchd/0.log"
Oct 07 18:20:35 crc kubenswrapper[4681]: I1007 18:20:35.446502 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6tf88_6a172508-6850-4bf5-8e7f-6c6674c4a1ee/ovsdb-server/0.log"
Oct 07 18:20:35 crc kubenswrapper[4681]: I1007 18:20:35.636613 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-xhwkc_8be45f14-7feb-40fa-a0a8-919c6d8cd052/ovn-controller/0.log"
Oct 07 18:20:35 crc kubenswrapper[4681]: I1007 18:20:35.967953 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-wd6mz_eb92dd00-8b97-470f-9f2c-3ff1ee783f93/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 18:20:36 crc kubenswrapper[4681]: I1007 18:20:36.067729 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1209f82a-cbcc-4833-98f0-6e2a07b53aeb/openstack-network-exporter/0.log"
Oct 07 18:20:36 crc kubenswrapper[4681]: I1007 18:20:36.186498 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1209f82a-cbcc-4833-98f0-6e2a07b53aeb/ovn-northd/0.log"
Oct 07 18:20:36 crc kubenswrapper[4681]: I1007 18:20:36.322421 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9de0f04a-f2ed-48ee-a873-8a02b70fb146/openstack-network-exporter/0.log"
Oct 07 18:20:36 crc kubenswrapper[4681]: I1007 18:20:36.378086 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hcjvj"]
Oct 07 18:20:36 crc kubenswrapper[4681]: E1007 18:20:36.378518 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5470119-eb69-4077-8130-2272adbd8a60" containerName="extract-utilities"
Oct 07 18:20:36 crc kubenswrapper[4681]: I1007 18:20:36.378533 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5470119-eb69-4077-8130-2272adbd8a60" containerName="extract-utilities"
Oct 07 18:20:36 crc kubenswrapper[4681]: E1007 18:20:36.378568 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5470119-eb69-4077-8130-2272adbd8a60" containerName="extract-content"
Oct 07 18:20:36 crc kubenswrapper[4681]: I1007 18:20:36.378576 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5470119-eb69-4077-8130-2272adbd8a60" containerName="extract-content"
Oct 07 18:20:36 crc kubenswrapper[4681]: E1007 18:20:36.378593 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5470119-eb69-4077-8130-2272adbd8a60" containerName="registry-server"
Oct 07 18:20:36 crc kubenswrapper[4681]: I1007 18:20:36.378598 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5470119-eb69-4077-8130-2272adbd8a60" containerName="registry-server"
Oct 07 18:20:36 crc kubenswrapper[4681]: I1007 18:20:36.378795 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5470119-eb69-4077-8130-2272adbd8a60" containerName="registry-server"
Oct 07 18:20:36 crc kubenswrapper[4681]: I1007 18:20:36.380039 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hcjvj"
Oct 07 18:20:36 crc kubenswrapper[4681]: I1007 18:20:36.389407 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hcjvj"]
Oct 07 18:20:36 crc kubenswrapper[4681]: I1007 18:20:36.546711 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n69nr\" (UniqueName: \"kubernetes.io/projected/966dfe91-1406-4a6f-ac1c-ae9960545c07-kube-api-access-n69nr\") pod \"community-operators-hcjvj\" (UID: \"966dfe91-1406-4a6f-ac1c-ae9960545c07\") " pod="openshift-marketplace/community-operators-hcjvj"
Oct 07 18:20:36 crc kubenswrapper[4681]: I1007 18:20:36.546765 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/966dfe91-1406-4a6f-ac1c-ae9960545c07-catalog-content\") pod \"community-operators-hcjvj\" (UID: \"966dfe91-1406-4a6f-ac1c-ae9960545c07\") " pod="openshift-marketplace/community-operators-hcjvj"
Oct 07 18:20:36 crc kubenswrapper[4681]: I1007 18:20:36.546867 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/966dfe91-1406-4a6f-ac1c-ae9960545c07-utilities\") pod \"community-operators-hcjvj\" (UID: \"966dfe91-1406-4a6f-ac1c-ae9960545c07\") " pod="openshift-marketplace/community-operators-hcjvj"
Oct 07 18:20:36 crc kubenswrapper[4681]: I1007 18:20:36.574296 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9de0f04a-f2ed-48ee-a873-8a02b70fb146/ovsdbserver-nb/0.log"
Oct 07 18:20:36 crc kubenswrapper[4681]: I1007 18:20:36.648500 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/966dfe91-1406-4a6f-ac1c-ae9960545c07-catalog-content\") pod \"community-operators-hcjvj\" (UID: \"966dfe91-1406-4a6f-ac1c-ae9960545c07\") " pod="openshift-marketplace/community-operators-hcjvj"
Oct 07 18:20:36 crc kubenswrapper[4681]: I1007 18:20:36.648572 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/966dfe91-1406-4a6f-ac1c-ae9960545c07-utilities\") pod \"community-operators-hcjvj\" (UID: \"966dfe91-1406-4a6f-ac1c-ae9960545c07\") " pod="openshift-marketplace/community-operators-hcjvj"
Oct 07 18:20:36 crc kubenswrapper[4681]: I1007 18:20:36.648683 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n69nr\" (UniqueName: \"kubernetes.io/projected/966dfe91-1406-4a6f-ac1c-ae9960545c07-kube-api-access-n69nr\") pod \"community-operators-hcjvj\" (UID: \"966dfe91-1406-4a6f-ac1c-ae9960545c07\") " pod="openshift-marketplace/community-operators-hcjvj"
Oct 07 18:20:36 crc kubenswrapper[4681]: I1007 18:20:36.649391 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/966dfe91-1406-4a6f-ac1c-ae9960545c07-catalog-content\") pod \"community-operators-hcjvj\" (UID: \"966dfe91-1406-4a6f-ac1c-ae9960545c07\") " pod="openshift-marketplace/community-operators-hcjvj"
Oct 07 18:20:36 crc kubenswrapper[4681]: I1007 18:20:36.650100 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/966dfe91-1406-4a6f-ac1c-ae9960545c07-utilities\") pod \"community-operators-hcjvj\" (UID: \"966dfe91-1406-4a6f-ac1c-ae9960545c07\") " pod="openshift-marketplace/community-operators-hcjvj"
Oct 07 18:20:36 crc kubenswrapper[4681]: I1007 18:20:36.666508 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n69nr\" (UniqueName: \"kubernetes.io/projected/966dfe91-1406-4a6f-ac1c-ae9960545c07-kube-api-access-n69nr\") pod \"community-operators-hcjvj\" (UID: \"966dfe91-1406-4a6f-ac1c-ae9960545c07\") " pod="openshift-marketplace/community-operators-hcjvj"
Oct 07 18:20:36 crc kubenswrapper[4681]: I1007 18:20:36.724422 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hcjvj"
Oct 07 18:20:37 crc kubenswrapper[4681]: I1007 18:20:37.135766 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d5d2debf-c5bb-47fa-9d33-69c2f549a3e0/openstack-network-exporter/0.log"
Oct 07 18:20:37 crc kubenswrapper[4681]: I1007 18:20:37.392319 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d5d2debf-c5bb-47fa-9d33-69c2f549a3e0/ovsdbserver-sb/0.log"
Oct 07 18:20:37 crc kubenswrapper[4681]: I1007 18:20:37.658616 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hcjvj"]
Oct 07 18:20:37 crc kubenswrapper[4681]: I1007 18:20:37.838713 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-68578dd4f6-bzx29_0d112a4c-ca20-4593-ac26-4e88a56ca00a/placement-api/0.log"
Oct 07 18:20:38 crc kubenswrapper[4681]: I1007 18:20:38.024466 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-68578dd4f6-bzx29_0d112a4c-ca20-4593-ac26-4e88a56ca00a/placement-log/0.log"
Oct 07 18:20:38 crc kubenswrapper[4681]: I1007 18:20:38.079801 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4222be9f-615b-431f-9285-c629a68426e0/setup-container/0.log"
Oct 07 18:20:38 crc kubenswrapper[4681]: I1007 18:20:38.282933 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4222be9f-615b-431f-9285-c629a68426e0/rabbitmq/0.log"
Oct 07 18:20:38 crc kubenswrapper[4681]: I1007 18:20:38.338093 4681 generic.go:334] "Generic (PLEG): container finished" podID="966dfe91-1406-4a6f-ac1c-ae9960545c07" containerID="8f1462e8e26c939830fbf51473009bb0f4c7837962d4424be48869ff83d6b0e5" exitCode=0
Oct 07 18:20:38 crc kubenswrapper[4681]: I1007 18:20:38.338153 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hcjvj" event={"ID":"966dfe91-1406-4a6f-ac1c-ae9960545c07","Type":"ContainerDied","Data":"8f1462e8e26c939830fbf51473009bb0f4c7837962d4424be48869ff83d6b0e5"}
Oct 07 18:20:38 crc kubenswrapper[4681]: I1007 18:20:38.338180 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hcjvj" event={"ID":"966dfe91-1406-4a6f-ac1c-ae9960545c07","Type":"ContainerStarted","Data":"a9cb2d0993af4de84f6cb632df0d1ff38db17e6e848f6f7ca22668b88bc51ea9"}
Oct 07 18:20:38 crc kubenswrapper[4681]: I1007 18:20:38.374769 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4222be9f-615b-431f-9285-c629a68426e0/setup-container/0.log"
Oct 07 18:20:38 crc kubenswrapper[4681]: I1007 18:20:38.544117 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6b4aa12d-0e45-47d7-b279-e705aef9c323/setup-container/0.log"
Oct 07 18:20:38 crc kubenswrapper[4681]: I1007 18:20:38.794310 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6b4aa12d-0e45-47d7-b279-e705aef9c323/rabbitmq/0.log"
Oct 07 18:20:38 crc kubenswrapper[4681]: I1007 18:20:38.807776 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6b4aa12d-0e45-47d7-b279-e705aef9c323/setup-container/0.log"
Oct 07 18:20:39 crc kubenswrapper[4681]: I1007 18:20:39.206671 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl_12f3d295-f471-4e4d-9884-3bf34dab377f/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 18:20:39 crc kubenswrapper[4681]: I1007 18:20:39.357660 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hcjvj" event={"ID":"966dfe91-1406-4a6f-ac1c-ae9960545c07","Type":"ContainerStarted","Data":"f5c0c2512674439233f85f63fa0b1cff2f3c8099aa47071dc11e6a3719ada450"}
Oct 07 18:20:39 crc kubenswrapper[4681]: I1007 18:20:39.391144 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-hmgls_0cdf5aef-8c9a-4cd5-8f38-2f368fe245df/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 18:20:39 crc kubenswrapper[4681]: I1007 18:20:39.577994 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj_a1d74e17-5142-40f0-9847-0f9ee5e33f90/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 18:20:39 crc kubenswrapper[4681]: I1007 18:20:39.945931 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-rp758_da2c6bbc-c4e1-4767-8815-fbc4cada002a/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 18:20:40 crc kubenswrapper[4681]: I1007 18:20:40.178738 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-qlsmk_4022d381-ce12-4c86-9368-4089026a66d3/ssh-known-hosts-edpm-deployment/0.log"
Oct 07 18:20:40 crc kubenswrapper[4681]: I1007 18:20:40.516941 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-58b7954b47-8j9j9_642b1a07-3c90-40b5-b6cb-af1d8832649b/proxy-server/0.log"
Oct 07 18:20:40 crc kubenswrapper[4681]: I1007 18:20:40.632224 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-58b7954b47-8j9j9_642b1a07-3c90-40b5-b6cb-af1d8832649b/proxy-httpd/0.log"
Oct 07 18:20:40 crc kubenswrapper[4681]: I1007 18:20:40.878062 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-npftw_51ee2d02-f1ea-4e04-817a-c08925a2078d/swift-ring-rebalance/0.log"
Oct 07 18:20:41 crc kubenswrapper[4681]: I1007 18:20:41.113147 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/account-auditor/0.log"
Oct 07 18:20:41 crc kubenswrapper[4681]: I1007 18:20:41.188223 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/account-reaper/0.log"
Oct 07 18:20:41 crc kubenswrapper[4681]: I1007 18:20:41.311337 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/account-replicator/0.log"
Oct 07 18:20:41 crc kubenswrapper[4681]: I1007 18:20:41.378074 4681 generic.go:334] "Generic (PLEG): container finished" podID="966dfe91-1406-4a6f-ac1c-ae9960545c07" containerID="f5c0c2512674439233f85f63fa0b1cff2f3c8099aa47071dc11e6a3719ada450" exitCode=0
Oct 07 18:20:41 crc kubenswrapper[4681]: I1007 18:20:41.378294 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hcjvj" event={"ID":"966dfe91-1406-4a6f-ac1c-ae9960545c07","Type":"ContainerDied","Data":"f5c0c2512674439233f85f63fa0b1cff2f3c8099aa47071dc11e6a3719ada450"}
Oct 07 18:20:41 crc kubenswrapper[4681]: I1007 18:20:41.802127 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/account-server/0.log"
Oct 07 18:20:41 crc kubenswrapper[4681]: I1007 18:20:41.824480 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/container-auditor/0.log"
Oct 07 18:20:41 crc kubenswrapper[4681]: I1007 18:20:41.968048 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/container-replicator/0.log"
Oct 07 18:20:42 crc kubenswrapper[4681]: I1007 18:20:42.117716 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/container-server/0.log"
Oct 07 18:20:42 crc kubenswrapper[4681]: I1007 18:20:42.183735 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/container-updater/0.log"
Oct 07 18:20:42 crc kubenswrapper[4681]: I1007 18:20:42.405404 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hcjvj" event={"ID":"966dfe91-1406-4a6f-ac1c-ae9960545c07","Type":"ContainerStarted","Data":"dd2f68df6f954bc3969c36dbf4e3298d8f5e4afbea6ce45f2d0b657f100e46a0"}
Oct 07 18:20:42 crc kubenswrapper[4681]: I1007 18:20:42.440421 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hcjvj" podStartSLOduration=2.958294804 podStartE2EDuration="6.440396546s" podCreationTimestamp="2025-10-07 18:20:36 +0000 UTC" firstStartedPulling="2025-10-07 18:20:38.340673861 +0000 UTC m=+4641.988085416" lastFinishedPulling="2025-10-07 18:20:41.822775603 +0000 UTC m=+4645.470187158" observedRunningTime="2025-10-07 18:20:42.432779243 +0000 UTC m=+4646.080190788" watchObservedRunningTime="2025-10-07 18:20:42.440396546 +0000 UTC m=+4646.087808101"
Oct 07 18:20:42 crc kubenswrapper[4681]: I1007 18:20:42.456370 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/object-auditor/0.log"
Oct 07 18:20:42 crc kubenswrapper[4681]: I1007 18:20:42.465013 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/object-expirer/0.log"
Oct 07 18:20:42 crc kubenswrapper[4681]: I1007 18:20:42.663861 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/object-replicator/0.log"
Oct 07 18:20:42 crc kubenswrapper[4681]: I1007 18:20:42.699382 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/object-server/0.log"
Oct 07 18:20:42 crc kubenswrapper[4681]: I1007 18:20:42.761613 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/object-updater/0.log"
Oct 07 18:20:43 crc kubenswrapper[4681]: I1007 18:20:43.072941 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/rsync/0.log"
Oct 07 18:20:43 crc kubenswrapper[4681]: I1007 18:20:43.117519 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/swift-recon-cron/0.log"
Oct 07 18:20:43 crc kubenswrapper[4681]: I1007 18:20:43.813803 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_01a2ae55-90f7-432a-bc03-aedd6db91210/tempest-tests-tempest-tests-runner/0.log"
Oct 07 18:20:43 crc kubenswrapper[4681]: I1007 18:20:43.841988 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-rbskw_650f08d2-bbd6-4cf7-b8d1-5923a4075672/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 18:20:44 crc kubenswrapper[4681]: I1007 18:20:44.154523 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_27c574e9-b637-4326-853e-f298321f1a1b/test-operator-logs-container/0.log"
Oct 07 18:20:44 crc kubenswrapper[4681]: I1007 18:20:44.357900 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-49wz2_d3ee9809-6f86-44fb-9b11-163437e7750e/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 07 18:20:46 crc kubenswrapper[4681]: I1007 18:20:46.725988 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hcjvj"
Oct 07 18:20:46 crc kubenswrapper[4681]: I1007 18:20:46.726378 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hcjvj"
Oct 07 18:20:46 crc kubenswrapper[4681]: I1007 18:20:46.818458 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hcjvj"
Oct 07 18:20:47 crc kubenswrapper[4681]: I1007 18:20:47.496192 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hcjvj"
Oct 07 18:20:47 crc kubenswrapper[4681]: I1007 18:20:47.588673 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hcjvj"]
Oct 07 18:20:49 crc kubenswrapper[4681]: I1007 18:20:49.461922 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hcjvj" podUID="966dfe91-1406-4a6f-ac1c-ae9960545c07" containerName="registry-server" containerID="cri-o://dd2f68df6f954bc3969c36dbf4e3298d8f5e4afbea6ce45f2d0b657f100e46a0" gracePeriod=2
Oct 07 18:20:50 crc kubenswrapper[4681]: I1007 18:20:50.047590 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hcjvj"
Oct 07 18:20:50 crc kubenswrapper[4681]: I1007 18:20:50.188821 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n69nr\" (UniqueName: \"kubernetes.io/projected/966dfe91-1406-4a6f-ac1c-ae9960545c07-kube-api-access-n69nr\") pod \"966dfe91-1406-4a6f-ac1c-ae9960545c07\" (UID: \"966dfe91-1406-4a6f-ac1c-ae9960545c07\") "
Oct 07 18:20:50 crc kubenswrapper[4681]: I1007 18:20:50.189113 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/966dfe91-1406-4a6f-ac1c-ae9960545c07-utilities\") pod \"966dfe91-1406-4a6f-ac1c-ae9960545c07\" (UID: \"966dfe91-1406-4a6f-ac1c-ae9960545c07\") "
Oct 07 18:20:50 crc kubenswrapper[4681]: I1007 18:20:50.189167 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/966dfe91-1406-4a6f-ac1c-ae9960545c07-catalog-content\") pod \"966dfe91-1406-4a6f-ac1c-ae9960545c07\" (UID: \"966dfe91-1406-4a6f-ac1c-ae9960545c07\") "
Oct 07 18:20:50 crc kubenswrapper[4681]: I1007 18:20:50.190824 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/966dfe91-1406-4a6f-ac1c-ae9960545c07-utilities" (OuterVolumeSpecName: "utilities") pod "966dfe91-1406-4a6f-ac1c-ae9960545c07" (UID: "966dfe91-1406-4a6f-ac1c-ae9960545c07"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 18:20:50 crc kubenswrapper[4681]: I1007 18:20:50.222394 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/966dfe91-1406-4a6f-ac1c-ae9960545c07-kube-api-access-n69nr" (OuterVolumeSpecName: "kube-api-access-n69nr") pod "966dfe91-1406-4a6f-ac1c-ae9960545c07" (UID: "966dfe91-1406-4a6f-ac1c-ae9960545c07"). InnerVolumeSpecName "kube-api-access-n69nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 18:20:50 crc kubenswrapper[4681]: I1007 18:20:50.254060 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/966dfe91-1406-4a6f-ac1c-ae9960545c07-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "966dfe91-1406-4a6f-ac1c-ae9960545c07" (UID: "966dfe91-1406-4a6f-ac1c-ae9960545c07"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 18:20:50 crc kubenswrapper[4681]: I1007 18:20:50.290765 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/966dfe91-1406-4a6f-ac1c-ae9960545c07-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 18:20:50 crc kubenswrapper[4681]: I1007 18:20:50.290792 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/966dfe91-1406-4a6f-ac1c-ae9960545c07-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 18:20:50 crc kubenswrapper[4681]: I1007 18:20:50.290803 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n69nr\" (UniqueName: \"kubernetes.io/projected/966dfe91-1406-4a6f-ac1c-ae9960545c07-kube-api-access-n69nr\") on node \"crc\" DevicePath \"\""
Oct 07 18:20:50 crc kubenswrapper[4681]: I1007 18:20:50.475737 4681 generic.go:334] "Generic (PLEG): container finished" podID="966dfe91-1406-4a6f-ac1c-ae9960545c07" containerID="dd2f68df6f954bc3969c36dbf4e3298d8f5e4afbea6ce45f2d0b657f100e46a0" exitCode=0
Oct 07 18:20:50 crc kubenswrapper[4681]: I1007 18:20:50.475778 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hcjvj" event={"ID":"966dfe91-1406-4a6f-ac1c-ae9960545c07","Type":"ContainerDied","Data":"dd2f68df6f954bc3969c36dbf4e3298d8f5e4afbea6ce45f2d0b657f100e46a0"}
Oct 07 18:20:50 crc kubenswrapper[4681]: I1007 18:20:50.475805 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hcjvj" event={"ID":"966dfe91-1406-4a6f-ac1c-ae9960545c07","Type":"ContainerDied","Data":"a9cb2d0993af4de84f6cb632df0d1ff38db17e6e848f6f7ca22668b88bc51ea9"}
Oct 07 18:20:50 crc kubenswrapper[4681]: I1007 18:20:50.475830 4681 scope.go:117] "RemoveContainer" containerID="dd2f68df6f954bc3969c36dbf4e3298d8f5e4afbea6ce45f2d0b657f100e46a0"
Oct 07 18:20:50 crc kubenswrapper[4681]: I1007 18:20:50.475925 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hcjvj"
Oct 07 18:20:50 crc kubenswrapper[4681]: I1007 18:20:50.497960 4681 scope.go:117] "RemoveContainer" containerID="f5c0c2512674439233f85f63fa0b1cff2f3c8099aa47071dc11e6a3719ada450"
Oct 07 18:20:50 crc kubenswrapper[4681]: I1007 18:20:50.535240 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hcjvj"]
Oct 07 18:20:50 crc kubenswrapper[4681]: I1007 18:20:50.550645 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hcjvj"]
Oct 07 18:20:50 crc kubenswrapper[4681]: I1007 18:20:50.557766 4681 scope.go:117] "RemoveContainer" containerID="8f1462e8e26c939830fbf51473009bb0f4c7837962d4424be48869ff83d6b0e5"
Oct 07 18:20:50 crc kubenswrapper[4681]: I1007 18:20:50.625363 4681 scope.go:117] "RemoveContainer" containerID="dd2f68df6f954bc3969c36dbf4e3298d8f5e4afbea6ce45f2d0b657f100e46a0"
Oct 07 18:20:50 crc kubenswrapper[4681]: E1007 18:20:50.625811 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd2f68df6f954bc3969c36dbf4e3298d8f5e4afbea6ce45f2d0b657f100e46a0\": container with ID starting with dd2f68df6f954bc3969c36dbf4e3298d8f5e4afbea6ce45f2d0b657f100e46a0 not found: ID does not exist" containerID="dd2f68df6f954bc3969c36dbf4e3298d8f5e4afbea6ce45f2d0b657f100e46a0"
Oct 07 18:20:50 crc kubenswrapper[4681]: I1007 18:20:50.625857 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd2f68df6f954bc3969c36dbf4e3298d8f5e4afbea6ce45f2d0b657f100e46a0"} err="failed to get container status \"dd2f68df6f954bc3969c36dbf4e3298d8f5e4afbea6ce45f2d0b657f100e46a0\": rpc error: code = NotFound desc = could not find container \"dd2f68df6f954bc3969c36dbf4e3298d8f5e4afbea6ce45f2d0b657f100e46a0\": container with ID starting with dd2f68df6f954bc3969c36dbf4e3298d8f5e4afbea6ce45f2d0b657f100e46a0 not found: ID does not exist"
Oct 07 18:20:50 crc kubenswrapper[4681]: I1007 18:20:50.625905 4681 scope.go:117] "RemoveContainer" containerID="f5c0c2512674439233f85f63fa0b1cff2f3c8099aa47071dc11e6a3719ada450"
Oct 07 18:20:50 crc kubenswrapper[4681]: E1007 18:20:50.626252 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5c0c2512674439233f85f63fa0b1cff2f3c8099aa47071dc11e6a3719ada450\": container with ID starting with f5c0c2512674439233f85f63fa0b1cff2f3c8099aa47071dc11e6a3719ada450 not found: ID does not exist" containerID="f5c0c2512674439233f85f63fa0b1cff2f3c8099aa47071dc11e6a3719ada450"
Oct 07 18:20:50 crc kubenswrapper[4681]: I1007 18:20:50.626286 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5c0c2512674439233f85f63fa0b1cff2f3c8099aa47071dc11e6a3719ada450"} err="failed to get container status \"f5c0c2512674439233f85f63fa0b1cff2f3c8099aa47071dc11e6a3719ada450\": rpc error: code = NotFound desc = could not find container \"f5c0c2512674439233f85f63fa0b1cff2f3c8099aa47071dc11e6a3719ada450\": container with ID starting with f5c0c2512674439233f85f63fa0b1cff2f3c8099aa47071dc11e6a3719ada450 not found: ID does not exist"
Oct 07 18:20:50 crc kubenswrapper[4681]: I1007 18:20:50.626309 4681 scope.go:117] "RemoveContainer" containerID="8f1462e8e26c939830fbf51473009bb0f4c7837962d4424be48869ff83d6b0e5"
Oct 07 18:20:50 crc kubenswrapper[4681]: E1007 18:20:50.630960 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f1462e8e26c939830fbf51473009bb0f4c7837962d4424be48869ff83d6b0e5\": container with ID starting with 8f1462e8e26c939830fbf51473009bb0f4c7837962d4424be48869ff83d6b0e5 not found: ID does not exist" containerID="8f1462e8e26c939830fbf51473009bb0f4c7837962d4424be48869ff83d6b0e5"
Oct 07 18:20:50 crc kubenswrapper[4681]: I1007 18:20:50.630991 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f1462e8e26c939830fbf51473009bb0f4c7837962d4424be48869ff83d6b0e5"} err="failed to get container status \"8f1462e8e26c939830fbf51473009bb0f4c7837962d4424be48869ff83d6b0e5\": rpc error: code = NotFound desc = could not find container \"8f1462e8e26c939830fbf51473009bb0f4c7837962d4424be48869ff83d6b0e5\": container with ID starting with 8f1462e8e26c939830fbf51473009bb0f4c7837962d4424be48869ff83d6b0e5 not found: ID does not exist"
Oct 07 18:20:51 crc kubenswrapper[4681]: I1007 18:20:51.037828 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="966dfe91-1406-4a6f-ac1c-ae9960545c07" path="/var/lib/kubelet/pods/966dfe91-1406-4a6f-ac1c-ae9960545c07/volumes"
Oct 07 18:20:51 crc kubenswrapper[4681]: I1007 18:20:51.070267 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_1aa4b182-3cf3-4e5d-b59d-5e00004cb912/memcached/0.log"
Oct 07 18:21:21 crc kubenswrapper[4681]: I1007 18:21:21.762640 4681 generic.go:334] "Generic (PLEG): container finished" podID="cda1c49b-59ff-4fb8-a89b-32540ea78511" containerID="1a4ca6d2a3b2f33743a84ca90249125df2fbc0cf6ba317c78cd031d4dff88cef" exitCode=0
Oct 07 18:21:21 crc kubenswrapper[4681]: I1007 18:21:21.762715 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8xmpv/crc-debug-8qs2b" event={"ID":"cda1c49b-59ff-4fb8-a89b-32540ea78511","Type":"ContainerDied","Data":"1a4ca6d2a3b2f33743a84ca90249125df2fbc0cf6ba317c78cd031d4dff88cef"}
Oct 07 18:21:22 crc kubenswrapper[4681]: I1007 18:21:22.867411 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8xmpv/crc-debug-8qs2b"
Oct 07 18:21:22 crc kubenswrapper[4681]: I1007 18:21:22.903628 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8xmpv/crc-debug-8qs2b"]
Oct 07 18:21:22 crc kubenswrapper[4681]: I1007 18:21:22.911830 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8xmpv/crc-debug-8qs2b"]
Oct 07 18:21:22 crc kubenswrapper[4681]: I1007 18:21:22.970748 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkvwh\" (UniqueName: \"kubernetes.io/projected/cda1c49b-59ff-4fb8-a89b-32540ea78511-kube-api-access-nkvwh\") pod \"cda1c49b-59ff-4fb8-a89b-32540ea78511\" (UID: \"cda1c49b-59ff-4fb8-a89b-32540ea78511\") "
Oct 07 18:21:22 crc kubenswrapper[4681]: I1007 18:21:22.970929 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cda1c49b-59ff-4fb8-a89b-32540ea78511-host\") pod \"cda1c49b-59ff-4fb8-a89b-32540ea78511\" (UID: \"cda1c49b-59ff-4fb8-a89b-32540ea78511\") "
Oct 07 18:21:22 crc kubenswrapper[4681]: I1007 18:21:22.971039 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cda1c49b-59ff-4fb8-a89b-32540ea78511-host" (OuterVolumeSpecName: "host") pod "cda1c49b-59ff-4fb8-a89b-32540ea78511" (UID: "cda1c49b-59ff-4fb8-a89b-32540ea78511"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 07 18:21:22 crc kubenswrapper[4681]: I1007 18:21:22.971475 4681 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cda1c49b-59ff-4fb8-a89b-32540ea78511-host\") on node \"crc\" DevicePath \"\""
Oct 07 18:21:23 crc kubenswrapper[4681]: I1007 18:21:23.404900 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cda1c49b-59ff-4fb8-a89b-32540ea78511-kube-api-access-nkvwh" (OuterVolumeSpecName: "kube-api-access-nkvwh") pod "cda1c49b-59ff-4fb8-a89b-32540ea78511" (UID: "cda1c49b-59ff-4fb8-a89b-32540ea78511"). InnerVolumeSpecName "kube-api-access-nkvwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 18:21:23 crc kubenswrapper[4681]: I1007 18:21:23.480103 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkvwh\" (UniqueName: \"kubernetes.io/projected/cda1c49b-59ff-4fb8-a89b-32540ea78511-kube-api-access-nkvwh\") on node \"crc\" DevicePath \"\""
Oct 07 18:21:23 crc kubenswrapper[4681]: I1007 18:21:23.781032 4681 scope.go:117] "RemoveContainer" containerID="1a4ca6d2a3b2f33743a84ca90249125df2fbc0cf6ba317c78cd031d4dff88cef"
Oct 07 18:21:23 crc kubenswrapper[4681]: I1007 18:21:23.781116 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8xmpv/crc-debug-8qs2b"
Oct 07 18:21:24 crc kubenswrapper[4681]: I1007 18:21:24.526691 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8xmpv/crc-debug-xlpnv"]
Oct 07 18:21:24 crc kubenswrapper[4681]: E1007 18:21:24.527334 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cda1c49b-59ff-4fb8-a89b-32540ea78511" containerName="container-00"
Oct 07 18:21:24 crc kubenswrapper[4681]: I1007 18:21:24.527349 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda1c49b-59ff-4fb8-a89b-32540ea78511" containerName="container-00"
Oct 07 18:21:24 crc kubenswrapper[4681]: E1007 18:21:24.527363 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="966dfe91-1406-4a6f-ac1c-ae9960545c07" containerName="registry-server"
Oct 07 18:21:24 crc kubenswrapper[4681]: I1007 18:21:24.527369 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="966dfe91-1406-4a6f-ac1c-ae9960545c07" containerName="registry-server"
Oct 07 18:21:24 crc kubenswrapper[4681]: E1007 18:21:24.527384 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="966dfe91-1406-4a6f-ac1c-ae9960545c07" containerName="extract-utilities"
Oct 07 18:21:24 crc kubenswrapper[4681]: I1007 18:21:24.527390 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="966dfe91-1406-4a6f-ac1c-ae9960545c07" containerName="extract-utilities"
Oct 07 18:21:24 crc kubenswrapper[4681]: E1007 18:21:24.527401 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="966dfe91-1406-4a6f-ac1c-ae9960545c07" containerName="extract-content"
Oct 07 18:21:24 crc kubenswrapper[4681]: I1007 18:21:24.527406 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="966dfe91-1406-4a6f-ac1c-ae9960545c07" containerName="extract-content"
Oct 07 18:21:24 crc kubenswrapper[4681]: I1007 18:21:24.527564 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="cda1c49b-59ff-4fb8-a89b-32540ea78511" containerName="container-00"
Oct 07 18:21:24 crc kubenswrapper[4681]: I1007 18:21:24.527577 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="966dfe91-1406-4a6f-ac1c-ae9960545c07" containerName="registry-server"
Oct 07 18:21:24 crc kubenswrapper[4681]: I1007 18:21:24.528198 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8xmpv/crc-debug-xlpnv"
Oct 07 18:21:24 crc kubenswrapper[4681]: I1007 18:21:24.530237 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8xmpv"/"default-dockercfg-rhmg5"
Oct 07 18:21:24 crc kubenswrapper[4681]: I1007 18:21:24.598182 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea858973-c1af-4614-a41e-634aca3cbc25-host\") pod \"crc-debug-xlpnv\" (UID: \"ea858973-c1af-4614-a41e-634aca3cbc25\") " pod="openshift-must-gather-8xmpv/crc-debug-xlpnv"
Oct 07 18:21:24 crc kubenswrapper[4681]: I1007 18:21:24.598247 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b797d\" (UniqueName: \"kubernetes.io/projected/ea858973-c1af-4614-a41e-634aca3cbc25-kube-api-access-b797d\") pod \"crc-debug-xlpnv\" (UID: \"ea858973-c1af-4614-a41e-634aca3cbc25\") " pod="openshift-must-gather-8xmpv/crc-debug-xlpnv"
Oct 07 18:21:24 crc kubenswrapper[4681]: I1007 18:21:24.700506 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea858973-c1af-4614-a41e-634aca3cbc25-host\") pod \"crc-debug-xlpnv\" (UID: \"ea858973-c1af-4614-a41e-634aca3cbc25\") " pod="openshift-must-gather-8xmpv/crc-debug-xlpnv"
Oct 07 18:21:24 crc kubenswrapper[4681]: I1007 18:21:24.700632 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b797d\" (UniqueName: \"kubernetes.io/projected/ea858973-c1af-4614-a41e-634aca3cbc25-kube-api-access-b797d\") pod \"crc-debug-xlpnv\" (UID: \"ea858973-c1af-4614-a41e-634aca3cbc25\") " pod="openshift-must-gather-8xmpv/crc-debug-xlpnv"
Oct 07 18:21:24 crc kubenswrapper[4681]: I1007 18:21:24.700647 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea858973-c1af-4614-a41e-634aca3cbc25-host\") pod \"crc-debug-xlpnv\" (UID: \"ea858973-c1af-4614-a41e-634aca3cbc25\") " pod="openshift-must-gather-8xmpv/crc-debug-xlpnv"
Oct 07 18:21:24 crc kubenswrapper[4681]: I1007 18:21:24.717857 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b797d\" (UniqueName: \"kubernetes.io/projected/ea858973-c1af-4614-a41e-634aca3cbc25-kube-api-access-b797d\") pod \"crc-debug-xlpnv\" (UID: \"ea858973-c1af-4614-a41e-634aca3cbc25\") " pod="openshift-must-gather-8xmpv/crc-debug-xlpnv"
Oct 07 18:21:24 crc kubenswrapper[4681]: I1007 18:21:24.845727 4681 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-8xmpv/crc-debug-xlpnv" Oct 07 18:21:25 crc kubenswrapper[4681]: I1007 18:21:25.039046 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cda1c49b-59ff-4fb8-a89b-32540ea78511" path="/var/lib/kubelet/pods/cda1c49b-59ff-4fb8-a89b-32540ea78511/volumes" Oct 07 18:21:25 crc kubenswrapper[4681]: I1007 18:21:25.801870 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8xmpv/crc-debug-xlpnv" event={"ID":"ea858973-c1af-4614-a41e-634aca3cbc25","Type":"ContainerStarted","Data":"e81f6d20256dff64f37afa30652e7fb518e875c5de3e690866aa62be9749171e"} Oct 07 18:21:25 crc kubenswrapper[4681]: I1007 18:21:25.802243 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8xmpv/crc-debug-xlpnv" event={"ID":"ea858973-c1af-4614-a41e-634aca3cbc25","Type":"ContainerStarted","Data":"a67c565b6ce53119f13da5947aa9ba7bef98b02d4f89ac70ed31d4b3a009ab90"} Oct 07 18:21:25 crc kubenswrapper[4681]: I1007 18:21:25.815736 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8xmpv/crc-debug-xlpnv" podStartSLOduration=1.815721594 podStartE2EDuration="1.815721594s" podCreationTimestamp="2025-10-07 18:21:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 18:21:25.813619035 +0000 UTC m=+4689.461030590" watchObservedRunningTime="2025-10-07 18:21:25.815721594 +0000 UTC m=+4689.463133149" Oct 07 18:21:26 crc kubenswrapper[4681]: I1007 18:21:26.819643 4681 generic.go:334] "Generic (PLEG): container finished" podID="ea858973-c1af-4614-a41e-634aca3cbc25" containerID="e81f6d20256dff64f37afa30652e7fb518e875c5de3e690866aa62be9749171e" exitCode=0 Oct 07 18:21:26 crc kubenswrapper[4681]: I1007 18:21:26.819976 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8xmpv/crc-debug-xlpnv" event={"ID":"ea858973-c1af-4614-a41e-634aca3cbc25","Type":"ContainerDied","Data":"e81f6d20256dff64f37afa30652e7fb518e875c5de3e690866aa62be9749171e"} Oct 07 18:21:27 crc kubenswrapper[4681]: I1007 18:21:27.978834 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8xmpv/crc-debug-xlpnv" Oct 07 18:21:28 crc kubenswrapper[4681]: I1007 18:21:28.157726 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea858973-c1af-4614-a41e-634aca3cbc25-host\") pod \"ea858973-c1af-4614-a41e-634aca3cbc25\" (UID: \"ea858973-c1af-4614-a41e-634aca3cbc25\") " Oct 07 18:21:28 crc kubenswrapper[4681]: I1007 18:21:28.158218 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b797d\" (UniqueName: \"kubernetes.io/projected/ea858973-c1af-4614-a41e-634aca3cbc25-kube-api-access-b797d\") pod \"ea858973-c1af-4614-a41e-634aca3cbc25\" (UID: \"ea858973-c1af-4614-a41e-634aca3cbc25\") " Oct 07 18:21:28 crc kubenswrapper[4681]: I1007 18:21:28.158681 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea858973-c1af-4614-a41e-634aca3cbc25-host" (OuterVolumeSpecName: "host") pod "ea858973-c1af-4614-a41e-634aca3cbc25" (UID: "ea858973-c1af-4614-a41e-634aca3cbc25"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 18:21:28 crc kubenswrapper[4681]: I1007 18:21:28.158931 4681 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea858973-c1af-4614-a41e-634aca3cbc25-host\") on node \"crc\" DevicePath \"\"" Oct 07 18:21:28 crc kubenswrapper[4681]: I1007 18:21:28.165701 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea858973-c1af-4614-a41e-634aca3cbc25-kube-api-access-b797d" (OuterVolumeSpecName: "kube-api-access-b797d") pod "ea858973-c1af-4614-a41e-634aca3cbc25" (UID: "ea858973-c1af-4614-a41e-634aca3cbc25"). InnerVolumeSpecName "kube-api-access-b797d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 18:21:28 crc kubenswrapper[4681]: I1007 18:21:28.260343 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b797d\" (UniqueName: \"kubernetes.io/projected/ea858973-c1af-4614-a41e-634aca3cbc25-kube-api-access-b797d\") on node \"crc\" DevicePath \"\"" Oct 07 18:21:28 crc kubenswrapper[4681]: I1007 18:21:28.852849 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8xmpv/crc-debug-xlpnv" event={"ID":"ea858973-c1af-4614-a41e-634aca3cbc25","Type":"ContainerDied","Data":"a67c565b6ce53119f13da5947aa9ba7bef98b02d4f89ac70ed31d4b3a009ab90"} Oct 07 18:21:28 crc kubenswrapper[4681]: I1007 18:21:28.852923 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a67c565b6ce53119f13da5947aa9ba7bef98b02d4f89ac70ed31d4b3a009ab90" Oct 07 18:21:28 crc kubenswrapper[4681]: I1007 18:21:28.852926 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8xmpv/crc-debug-xlpnv" Oct 07 18:21:34 crc kubenswrapper[4681]: I1007 18:21:34.282808 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8xmpv/crc-debug-xlpnv"] Oct 07 18:21:34 crc kubenswrapper[4681]: I1007 18:21:34.292587 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8xmpv/crc-debug-xlpnv"] Oct 07 18:21:35 crc kubenswrapper[4681]: I1007 18:21:35.044603 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea858973-c1af-4614-a41e-634aca3cbc25" path="/var/lib/kubelet/pods/ea858973-c1af-4614-a41e-634aca3cbc25/volumes" Oct 07 18:21:35 crc kubenswrapper[4681]: I1007 18:21:35.443408 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8xmpv/crc-debug-7qvs7"] Oct 07 18:21:35 crc kubenswrapper[4681]: E1007 18:21:35.443827 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea858973-c1af-4614-a41e-634aca3cbc25" containerName="container-00" Oct 07 18:21:35 crc kubenswrapper[4681]: I1007 18:21:35.443839 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea858973-c1af-4614-a41e-634aca3cbc25" containerName="container-00" Oct 07 18:21:35 crc kubenswrapper[4681]: I1007 18:21:35.444059 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea858973-c1af-4614-a41e-634aca3cbc25" containerName="container-00" Oct 07 18:21:35 crc kubenswrapper[4681]: I1007 18:21:35.444761 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8xmpv/crc-debug-7qvs7" Oct 07 18:21:35 crc kubenswrapper[4681]: I1007 18:21:35.446817 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8xmpv"/"default-dockercfg-rhmg5" Oct 07 18:21:35 crc kubenswrapper[4681]: I1007 18:21:35.593117 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6-host\") pod \"crc-debug-7qvs7\" (UID: \"5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6\") " pod="openshift-must-gather-8xmpv/crc-debug-7qvs7" Oct 07 18:21:35 crc kubenswrapper[4681]: I1007 18:21:35.593247 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fps4w\" (UniqueName: \"kubernetes.io/projected/5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6-kube-api-access-fps4w\") pod \"crc-debug-7qvs7\" (UID: \"5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6\") " pod="openshift-must-gather-8xmpv/crc-debug-7qvs7" Oct 07 18:21:35 crc kubenswrapper[4681]: I1007 18:21:35.695072 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6-host\") pod \"crc-debug-7qvs7\" (UID: \"5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6\") " pod="openshift-must-gather-8xmpv/crc-debug-7qvs7" Oct 07 18:21:35 crc kubenswrapper[4681]: I1007 18:21:35.695162 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fps4w\" (UniqueName: \"kubernetes.io/projected/5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6-kube-api-access-fps4w\") pod \"crc-debug-7qvs7\" (UID: \"5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6\") " pod="openshift-must-gather-8xmpv/crc-debug-7qvs7" Oct 07 18:21:35 crc kubenswrapper[4681]: I1007 18:21:35.695239 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6-host\") pod \"crc-debug-7qvs7\" (UID: \"5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6\") " pod="openshift-must-gather-8xmpv/crc-debug-7qvs7" Oct 07 18:21:35 crc kubenswrapper[4681]: I1007 18:21:35.720683 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fps4w\" (UniqueName: \"kubernetes.io/projected/5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6-kube-api-access-fps4w\") pod \"crc-debug-7qvs7\" (UID: \"5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6\") " pod="openshift-must-gather-8xmpv/crc-debug-7qvs7" Oct 07 18:21:35 crc kubenswrapper[4681]: I1007 18:21:35.762973 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8xmpv/crc-debug-7qvs7" Oct 07 18:21:35 crc kubenswrapper[4681]: I1007 18:21:35.918163 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8xmpv/crc-debug-7qvs7" event={"ID":"5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6","Type":"ContainerStarted","Data":"ced6d35deec38d35b931a22858b6ae4dd700e45fe2f8fb5ee6ac0f53b939f3b6"} Oct 07 18:21:36 crc kubenswrapper[4681]: I1007 18:21:36.926117 4681 generic.go:334] "Generic (PLEG): container finished" podID="5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6" containerID="548720473fdd55c8db1f2285745c6258132b15d3d5ba85cdce08c3b07d7b5221" exitCode=0 Oct 07 18:21:36 crc kubenswrapper[4681]: I1007 18:21:36.926218 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8xmpv/crc-debug-7qvs7" event={"ID":"5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6","Type":"ContainerDied","Data":"548720473fdd55c8db1f2285745c6258132b15d3d5ba85cdce08c3b07d7b5221"} Oct 07 18:21:36 crc kubenswrapper[4681]: I1007 18:21:36.965970 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8xmpv/crc-debug-7qvs7"] Oct 07 18:21:36 crc kubenswrapper[4681]: I1007 18:21:36.980802 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8xmpv/crc-debug-7qvs7"] Oct 07 18:21:38 crc kubenswrapper[4681]: I1007 18:21:38.045207 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8xmpv/crc-debug-7qvs7" Oct 07 18:21:38 crc kubenswrapper[4681]: I1007 18:21:38.242615 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fps4w\" (UniqueName: \"kubernetes.io/projected/5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6-kube-api-access-fps4w\") pod \"5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6\" (UID: \"5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6\") " Oct 07 18:21:38 crc kubenswrapper[4681]: I1007 18:21:38.243134 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6-host\") pod \"5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6\" (UID: \"5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6\") " Oct 07 18:21:38 crc kubenswrapper[4681]: I1007 18:21:38.243228 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6-host" (OuterVolumeSpecName: "host") pod "5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6" (UID: "5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 18:21:38 crc kubenswrapper[4681]: I1007 18:21:38.244105 4681 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6-host\") on node \"crc\" DevicePath \"\"" Oct 07 18:21:38 crc kubenswrapper[4681]: I1007 18:21:38.262220 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6-kube-api-access-fps4w" (OuterVolumeSpecName: "kube-api-access-fps4w") pod "5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6" (UID: "5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6"). InnerVolumeSpecName "kube-api-access-fps4w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 18:21:38 crc kubenswrapper[4681]: I1007 18:21:38.345009 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fps4w\" (UniqueName: \"kubernetes.io/projected/5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6-kube-api-access-fps4w\") on node \"crc\" DevicePath \"\"" Oct 07 18:21:38 crc kubenswrapper[4681]: I1007 18:21:38.813234 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4_accebcc3-c13d-4dab-bb1b-97f95eb370f3/util/0.log" Oct 07 18:21:38 crc kubenswrapper[4681]: I1007 18:21:38.950007 4681 scope.go:117] "RemoveContainer" containerID="548720473fdd55c8db1f2285745c6258132b15d3d5ba85cdce08c3b07d7b5221" Oct 07 18:21:38 crc kubenswrapper[4681]: I1007 18:21:38.950177 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8xmpv/crc-debug-7qvs7" Oct 07 18:21:39 crc kubenswrapper[4681]: I1007 18:21:39.034135 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4_accebcc3-c13d-4dab-bb1b-97f95eb370f3/util/0.log" Oct 07 18:21:39 crc kubenswrapper[4681]: I1007 18:21:39.045443 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6" path="/var/lib/kubelet/pods/5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6/volumes" Oct 07 18:21:39 crc kubenswrapper[4681]: I1007 18:21:39.046533 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4_accebcc3-c13d-4dab-bb1b-97f95eb370f3/pull/0.log" Oct 07 18:21:39 crc kubenswrapper[4681]: I1007 18:21:39.066184 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4_accebcc3-c13d-4dab-bb1b-97f95eb370f3/pull/0.log" Oct 07 18:21:39 crc kubenswrapper[4681]: I1007 18:21:39.248913 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4_accebcc3-c13d-4dab-bb1b-97f95eb370f3/extract/0.log" Oct 07 18:21:39 crc kubenswrapper[4681]: I1007 18:21:39.255931 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4_accebcc3-c13d-4dab-bb1b-97f95eb370f3/util/0.log" Oct 07 18:21:39 crc kubenswrapper[4681]: I1007 18:21:39.303379 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4_accebcc3-c13d-4dab-bb1b-97f95eb370f3/pull/0.log" Oct 07 18:21:39 crc kubenswrapper[4681]: I1007 18:21:39.534205 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-dhcz7_fe9f244f-7a1b-43f2-b1d2-08dcf0454fc3/kube-rbac-proxy/0.log" Oct 07 18:21:39 crc kubenswrapper[4681]: I1007 18:21:39.540773 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-dhcz7_fe9f244f-7a1b-43f2-b1d2-08dcf0454fc3/manager/0.log" Oct 07 18:21:39 crc kubenswrapper[4681]: I1007 18:21:39.620336 4681 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-c8nzf_c7125c26-53ab-471e-bf33-05265e3f571a/kube-rbac-proxy/0.log" Oct 07 18:21:39 crc kubenswrapper[4681]: I1007 18:21:39.726009 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-9qrr7_72f6dfae-3a77-46ad-874b-c94d9059566c/kube-rbac-proxy/0.log" Oct 07 18:21:39 crc kubenswrapper[4681]: I1007 18:21:39.779759 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-c8nzf_c7125c26-53ab-471e-bf33-05265e3f571a/manager/0.log" Oct 07 18:21:39 crc kubenswrapper[4681]: I1007 18:21:39.891494 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-9qrr7_72f6dfae-3a77-46ad-874b-c94d9059566c/manager/0.log" Oct 07 18:21:40 crc kubenswrapper[4681]: I1007 18:21:40.043152 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-rhvt8_049764d0-d62e-4553-9628-3d1b7258d126/kube-rbac-proxy/0.log" Oct 07 18:21:40 crc kubenswrapper[4681]: I1007 18:21:40.097608 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-rhvt8_049764d0-d62e-4553-9628-3d1b7258d126/manager/0.log" Oct 07 18:21:40 crc kubenswrapper[4681]: I1007 18:21:40.146147 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qmcj7"] Oct 07 18:21:40 crc kubenswrapper[4681]: E1007 18:21:40.146735 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6" containerName="container-00" Oct 07 18:21:40 crc kubenswrapper[4681]: I1007 18:21:40.146760 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6" containerName="container-00" Oct 07 18:21:40 crc kubenswrapper[4681]: I1007 18:21:40.147015 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eeb8852-c1db-4c3c-8a3e-9c07e5159cd6" containerName="container-00" Oct 07 18:21:40 crc kubenswrapper[4681]: I1007 18:21:40.148666 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qmcj7" Oct 07 18:21:40 crc kubenswrapper[4681]: I1007 18:21:40.182788 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qmcj7"] Oct 07 18:21:40 crc kubenswrapper[4681]: I1007 18:21:40.258019 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-98wq4_c3da478d-c5f4-473c-9848-740845c9adf1/kube-rbac-proxy/0.log" Oct 07 18:21:40 crc kubenswrapper[4681]: I1007 18:21:40.291204 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4751be0-208d-46d8-83c6-2fc35f3f9b24-catalog-content\") pod \"redhat-operators-qmcj7\" (UID: \"c4751be0-208d-46d8-83c6-2fc35f3f9b24\") " pod="openshift-marketplace/redhat-operators-qmcj7" Oct 07 18:21:40 crc kubenswrapper[4681]: I1007 18:21:40.291329 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4751be0-208d-46d8-83c6-2fc35f3f9b24-utilities\") pod \"redhat-operators-qmcj7\" (UID: \"c4751be0-208d-46d8-83c6-2fc35f3f9b24\") " pod="openshift-marketplace/redhat-operators-qmcj7" Oct 07 18:21:40 crc kubenswrapper[4681]: I1007 18:21:40.291450 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mltk\" (UniqueName: \"kubernetes.io/projected/c4751be0-208d-46d8-83c6-2fc35f3f9b24-kube-api-access-6mltk\") pod \"redhat-operators-qmcj7\" (UID: \"c4751be0-208d-46d8-83c6-2fc35f3f9b24\") " pod="openshift-marketplace/redhat-operators-qmcj7" Oct 07 18:21:40 crc kubenswrapper[4681]: I1007 18:21:40.390515 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-98wq4_c3da478d-c5f4-473c-9848-740845c9adf1/manager/0.log" Oct 07 18:21:40 crc kubenswrapper[4681]: I1007 18:21:40.393035 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mltk\" (UniqueName: \"kubernetes.io/projected/c4751be0-208d-46d8-83c6-2fc35f3f9b24-kube-api-access-6mltk\") pod \"redhat-operators-qmcj7\" (UID: \"c4751be0-208d-46d8-83c6-2fc35f3f9b24\") " pod="openshift-marketplace/redhat-operators-qmcj7" Oct 07 18:21:40 crc kubenswrapper[4681]: I1007 18:21:40.393087 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4751be0-208d-46d8-83c6-2fc35f3f9b24-catalog-content\") pod \"redhat-operators-qmcj7\" (UID: \"c4751be0-208d-46d8-83c6-2fc35f3f9b24\") " pod="openshift-marketplace/redhat-operators-qmcj7" Oct 07 18:21:40 crc kubenswrapper[4681]: I1007 18:21:40.393187 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4751be0-208d-46d8-83c6-2fc35f3f9b24-utilities\") pod \"redhat-operators-qmcj7\" (UID: \"c4751be0-208d-46d8-83c6-2fc35f3f9b24\") " pod="openshift-marketplace/redhat-operators-qmcj7" Oct 07 18:21:40 crc kubenswrapper[4681]: I1007 18:21:40.393740 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4751be0-208d-46d8-83c6-2fc35f3f9b24-utilities\") pod \"redhat-operators-qmcj7\" (UID: \"c4751be0-208d-46d8-83c6-2fc35f3f9b24\") " pod="openshift-marketplace/redhat-operators-qmcj7" Oct 07 18:21:40 crc 
kubenswrapper[4681]: I1007 18:21:40.393788 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4751be0-208d-46d8-83c6-2fc35f3f9b24-catalog-content\") pod \"redhat-operators-qmcj7\" (UID: \"c4751be0-208d-46d8-83c6-2fc35f3f9b24\") " pod="openshift-marketplace/redhat-operators-qmcj7" Oct 07 18:21:40 crc kubenswrapper[4681]: I1007 18:21:40.415759 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mltk\" (UniqueName: \"kubernetes.io/projected/c4751be0-208d-46d8-83c6-2fc35f3f9b24-kube-api-access-6mltk\") pod \"redhat-operators-qmcj7\" (UID: \"c4751be0-208d-46d8-83c6-2fc35f3f9b24\") " pod="openshift-marketplace/redhat-operators-qmcj7" Oct 07 18:21:40 crc kubenswrapper[4681]: I1007 18:21:40.469577 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qmcj7" Oct 07 18:21:40 crc kubenswrapper[4681]: I1007 18:21:40.602694 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-8qj4n_8e8c5ada-0313-4a16-b9cd-17d39ce932ca/manager/0.log" Oct 07 18:21:40 crc kubenswrapper[4681]: I1007 18:21:40.722371 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-8qj4n_8e8c5ada-0313-4a16-b9cd-17d39ce932ca/kube-rbac-proxy/0.log" Oct 07 18:21:41 crc kubenswrapper[4681]: I1007 18:21:41.000401 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-9fwcg_f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4/kube-rbac-proxy/0.log" Oct 07 18:21:41 crc kubenswrapper[4681]: I1007 18:21:41.133140 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qmcj7"] Oct 07 18:21:41 crc kubenswrapper[4681]: I1007 18:21:41.227907 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-9fwcg_f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4/manager/0.log" Oct 07 18:21:41 crc kubenswrapper[4681]: I1007 18:21:41.278779 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-mt5xr_049a3d2e-6274-44c0-8b56-d19e8d8b1cfc/kube-rbac-proxy/0.log" Oct 07 18:21:41 crc kubenswrapper[4681]: I1007 18:21:41.432497 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-mt5xr_049a3d2e-6274-44c0-8b56-d19e8d8b1cfc/manager/0.log" Oct 07 18:21:41 crc kubenswrapper[4681]: I1007 18:21:41.831160 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-8m6q6_bd602c09-19c5-45a7-b8fa-4202e147bbf9/kube-rbac-proxy/0.log" Oct 07 18:21:41 crc kubenswrapper[4681]: I1007 18:21:41.838565 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-8m6q6_bd602c09-19c5-45a7-b8fa-4202e147bbf9/manager/0.log" Oct 07 18:21:41 crc kubenswrapper[4681]: I1007 18:21:41.977343 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-spqlq_5cc0eff1-427a-4489-8957-f5148e6a0630/kube-rbac-proxy/0.log" Oct 07 18:21:41 crc kubenswrapper[4681]: I1007 18:21:41.987603 4681 generic.go:334] "Generic (PLEG): container finished" 
podID="c4751be0-208d-46d8-83c6-2fc35f3f9b24" containerID="2a82dc07dcb5176f71c12d66b920b9f6783708e2881a65bcf6268fd687132a66" exitCode=0 Oct 07 18:21:41 crc kubenswrapper[4681]: I1007 18:21:41.987654 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmcj7" event={"ID":"c4751be0-208d-46d8-83c6-2fc35f3f9b24","Type":"ContainerDied","Data":"2a82dc07dcb5176f71c12d66b920b9f6783708e2881a65bcf6268fd687132a66"} Oct 07 18:21:41 crc kubenswrapper[4681]: I1007 18:21:41.987689 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmcj7" event={"ID":"c4751be0-208d-46d8-83c6-2fc35f3f9b24","Type":"ContainerStarted","Data":"a91787c1b60d79c31a38809cc6cad7b07368e1eb0ad4ff041d8f6dedf8eaf039"} Oct 07 18:21:42 crc kubenswrapper[4681]: I1007 18:21:42.071357 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-spqlq_5cc0eff1-427a-4489-8957-f5148e6a0630/manager/0.log" Oct 07 18:21:42 crc kubenswrapper[4681]: I1007 18:21:42.222516 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-ddgm8_720e687c-21aa-4f31-bc4f-7be0f836ec16/manager/0.log" Oct 07 18:21:42 crc kubenswrapper[4681]: I1007 18:21:42.224095 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-ddgm8_720e687c-21aa-4f31-bc4f-7be0f836ec16/kube-rbac-proxy/0.log" Oct 07 18:21:42 crc kubenswrapper[4681]: I1007 18:21:42.391422 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-plkk5_52146033-65f3-42f4-b0b8-2b550445305f/kube-rbac-proxy/0.log" Oct 07 18:21:42 crc kubenswrapper[4681]: I1007 18:21:42.516290 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-plkk5_52146033-65f3-42f4-b0b8-2b550445305f/manager/0.log" Oct 07 18:21:42 crc kubenswrapper[4681]: I1007 18:21:42.554635 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-vh9d7_9c9bc247-6ea6-486c-956c-292930b2c111/kube-rbac-proxy/0.log" Oct 07 18:21:42 crc kubenswrapper[4681]: I1007 18:21:42.722735 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-vh9d7_9c9bc247-6ea6-486c-956c-292930b2c111/manager/0.log" Oct 07 18:21:43 crc kubenswrapper[4681]: I1007 18:21:43.101210 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-t75k9_c318e2b6-9014-471c-b54d-de14e50a1dfe/manager/0.log" Oct 07 18:21:43 crc kubenswrapper[4681]: I1007 18:21:43.172804 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-t75k9_c318e2b6-9014-471c-b54d-de14e50a1dfe/kube-rbac-proxy/0.log" Oct 07 18:21:43 crc kubenswrapper[4681]: I1007 18:21:43.173413 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665clkrns_a7636f86-a942-4f89-bc80-01a3ce70c13e/manager/0.log" Oct 07 18:21:43 crc kubenswrapper[4681]: I1007 18:21:43.175934 4681 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665clkrns_a7636f86-a942-4f89-bc80-01a3ce70c13e/kube-rbac-proxy/0.log" Oct 07 18:21:43 crc kubenswrapper[4681]: I1007 18:21:43.396780 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-77dffbdc98-vqctw_66b31094-5895-41aa-a268-fd2d13990f9f/kube-rbac-proxy/0.log" Oct 07 18:21:43 crc kubenswrapper[4681]: I1007 18:21:43.499420 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6687d89476-pv9kh_9e7a0d41-92ad-4dd7-b836-04c049817f6f/kube-rbac-proxy/0.log" Oct 07 18:21:43 crc kubenswrapper[4681]: I1007 18:21:43.834246 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-w4jqq_0613f93f-af7c-4a36-8baa-642a076f5666/registry-server/0.log" Oct 07 18:21:43 crc kubenswrapper[4681]: I1007 18:21:43.844704 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6687d89476-pv9kh_9e7a0d41-92ad-4dd7-b836-04c049817f6f/operator/0.log" Oct 07 18:21:44 crc kubenswrapper[4681]: I1007 18:21:44.016291 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmcj7" event={"ID":"c4751be0-208d-46d8-83c6-2fc35f3f9b24","Type":"ContainerStarted","Data":"4a020753b5ea0009a34fb792d038f821950595e3ce3b58aca034fd90b30d3b4c"} Oct 07 18:21:44 crc kubenswrapper[4681]: I1007 18:21:44.065009 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-v7gh2_10d09ccc-8bc7-4bf8-8bb4-b5bd1b234b28/kube-rbac-proxy/0.log" Oct 07 18:21:44 crc kubenswrapper[4681]: I1007 18:21:44.160414 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-688l4_97a43b61-b120-4613-9b2a-603e1d90878a/kube-rbac-proxy/0.log" Oct 07 18:21:44 crc kubenswrapper[4681]: I1007 18:21:44.208304 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-v7gh2_10d09ccc-8bc7-4bf8-8bb4-b5bd1b234b28/manager/0.log" Oct 07 18:21:44 crc kubenswrapper[4681]: I1007 18:21:44.544529 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-77dffbdc98-vqctw_66b31094-5895-41aa-a268-fd2d13990f9f/manager/0.log" Oct 07 18:21:44 crc kubenswrapper[4681]: I1007 18:21:44.656022 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-sq4dv_357d30fc-7c29-4bea-a20a-926b5723bcb0/operator/0.log" Oct 07 18:21:44 crc kubenswrapper[4681]: I1007 18:21:44.658189 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-688l4_97a43b61-b120-4613-9b2a-603e1d90878a/manager/0.log" Oct 07 18:21:44 crc kubenswrapper[4681]: I1007 18:21:44.758760 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-bxvlr_df2c4c3b-cd2b-487e-bef4-071d2c9f0eb4/kube-rbac-proxy/0.log" Oct 07 18:21:44 crc kubenswrapper[4681]: I1007 18:21:44.875986 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-bxvlr_df2c4c3b-cd2b-487e-bef4-071d2c9f0eb4/manager/0.log" Oct 07 18:21:44 crc 
kubenswrapper[4681]: I1007 18:21:44.935281 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-zhppp_1a3899b1-53e5-413b-b1c1-c7d2f2274b75/kube-rbac-proxy/0.log" Oct 07 18:21:45 crc kubenswrapper[4681]: I1007 18:21:45.091773 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-zhppp_1a3899b1-53e5-413b-b1c1-c7d2f2274b75/manager/0.log" Oct 07 18:21:45 crc kubenswrapper[4681]: I1007 18:21:45.181629 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-svd7g_be78a905-7f1e-4ea1-baf4-5f84246df65f/kube-rbac-proxy/0.log" Oct 07 18:21:45 crc kubenswrapper[4681]: I1007 18:21:45.232771 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-svd7g_be78a905-7f1e-4ea1-baf4-5f84246df65f/manager/0.log" Oct 07 18:21:45 crc kubenswrapper[4681]: I1007 18:21:45.298996 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-gn84j_6e4f29f4-5ec2-4476-9153-954cc984443f/kube-rbac-proxy/0.log" Oct 07 18:21:45 crc kubenswrapper[4681]: I1007 18:21:45.332711 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-gn84j_6e4f29f4-5ec2-4476-9153-954cc984443f/manager/0.log" Oct 07 18:21:46 crc kubenswrapper[4681]: E1007 18:21:46.275514 4681 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4751be0_208d_46d8_83c6_2fc35f3f9b24.slice/crio-4a020753b5ea0009a34fb792d038f821950595e3ce3b58aca034fd90b30d3b4c.scope\": RecentStats: unable to find data in memory cache]" Oct 07 18:21:47 crc kubenswrapper[4681]: I1007 18:21:47.044623 4681 generic.go:334] "Generic (PLEG): container finished" podID="c4751be0-208d-46d8-83c6-2fc35f3f9b24" containerID="4a020753b5ea0009a34fb792d038f821950595e3ce3b58aca034fd90b30d3b4c" exitCode=0 Oct 07 18:21:47 crc kubenswrapper[4681]: I1007 18:21:47.044715 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmcj7" event={"ID":"c4751be0-208d-46d8-83c6-2fc35f3f9b24","Type":"ContainerDied","Data":"4a020753b5ea0009a34fb792d038f821950595e3ce3b58aca034fd90b30d3b4c"} Oct 07 18:21:48 crc kubenswrapper[4681]: I1007 18:21:48.055145 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmcj7" event={"ID":"c4751be0-208d-46d8-83c6-2fc35f3f9b24","Type":"ContainerStarted","Data":"21f0cb9c764e26b4c5876824c9f87ba55d0abc3e5bc06287400e56bc1fcac346"} Oct 07 18:21:48 crc kubenswrapper[4681]: I1007 18:21:48.075227 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qmcj7" podStartSLOduration=2.536861575 podStartE2EDuration="8.075203681s" podCreationTimestamp="2025-10-07 18:21:40 +0000 UTC" firstStartedPulling="2025-10-07 18:21:41.98996283 +0000 UTC m=+4705.637374375" lastFinishedPulling="2025-10-07 18:21:47.528304926 +0000 UTC m=+4711.175716481" observedRunningTime="2025-10-07 18:21:48.070258453 +0000 UTC m=+4711.717670028" watchObservedRunningTime="2025-10-07 18:21:48.075203681 +0000 UTC m=+4711.722615236" Oct 07 18:21:50 crc kubenswrapper[4681]: I1007 18:21:50.470418 4681 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qmcj7" Oct 07 18:21:50 crc kubenswrapper[4681]: I1007 18:21:50.471094 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qmcj7" Oct 07 18:21:51 crc kubenswrapper[4681]: I1007 18:21:51.523431 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qmcj7" podUID="c4751be0-208d-46d8-83c6-2fc35f3f9b24" containerName="registry-server" probeResult="failure" output=< Oct 07 18:21:51 crc kubenswrapper[4681]: timeout: failed to connect service ":50051" within 1s Oct 07 18:21:51 crc kubenswrapper[4681]: > Oct 07 18:22:00 crc kubenswrapper[4681]: I1007 18:22:00.526719 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qmcj7" Oct 07 18:22:00 crc kubenswrapper[4681]: I1007 18:22:00.593080 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qmcj7" Oct 07 18:22:00 crc kubenswrapper[4681]: I1007 18:22:00.773367 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qmcj7"] Oct 07 18:22:02 crc kubenswrapper[4681]: I1007 18:22:02.177541 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qmcj7" podUID="c4751be0-208d-46d8-83c6-2fc35f3f9b24" containerName="registry-server" containerID="cri-o://21f0cb9c764e26b4c5876824c9f87ba55d0abc3e5bc06287400e56bc1fcac346" gracePeriod=2 Oct 07 18:22:02 crc kubenswrapper[4681]: I1007 18:22:02.647618 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qmcj7" Oct 07 18:22:02 crc kubenswrapper[4681]: I1007 18:22:02.822336 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4751be0-208d-46d8-83c6-2fc35f3f9b24-utilities\") pod \"c4751be0-208d-46d8-83c6-2fc35f3f9b24\" (UID: \"c4751be0-208d-46d8-83c6-2fc35f3f9b24\") " Oct 07 18:22:02 crc kubenswrapper[4681]: I1007 18:22:02.822592 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4751be0-208d-46d8-83c6-2fc35f3f9b24-catalog-content\") pod \"c4751be0-208d-46d8-83c6-2fc35f3f9b24\" (UID: \"c4751be0-208d-46d8-83c6-2fc35f3f9b24\") " Oct 07 18:22:02 crc kubenswrapper[4681]: I1007 18:22:02.822637 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mltk\" (UniqueName: \"kubernetes.io/projected/c4751be0-208d-46d8-83c6-2fc35f3f9b24-kube-api-access-6mltk\") pod \"c4751be0-208d-46d8-83c6-2fc35f3f9b24\" (UID: \"c4751be0-208d-46d8-83c6-2fc35f3f9b24\") " Oct 07 18:22:02 crc kubenswrapper[4681]: I1007 18:22:02.822920 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4751be0-208d-46d8-83c6-2fc35f3f9b24-utilities" (OuterVolumeSpecName: "utilities") pod "c4751be0-208d-46d8-83c6-2fc35f3f9b24" (UID: "c4751be0-208d-46d8-83c6-2fc35f3f9b24"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 18:22:02 crc kubenswrapper[4681]: I1007 18:22:02.823365 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4751be0-208d-46d8-83c6-2fc35f3f9b24-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 18:22:02 crc kubenswrapper[4681]: I1007 18:22:02.829736 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4751be0-208d-46d8-83c6-2fc35f3f9b24-kube-api-access-6mltk" (OuterVolumeSpecName: "kube-api-access-6mltk") pod "c4751be0-208d-46d8-83c6-2fc35f3f9b24" (UID: "c4751be0-208d-46d8-83c6-2fc35f3f9b24"). InnerVolumeSpecName "kube-api-access-6mltk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 18:22:02 crc kubenswrapper[4681]: I1007 18:22:02.921536 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4751be0-208d-46d8-83c6-2fc35f3f9b24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4751be0-208d-46d8-83c6-2fc35f3f9b24" (UID: "c4751be0-208d-46d8-83c6-2fc35f3f9b24"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 18:22:02 crc kubenswrapper[4681]: I1007 18:22:02.925685 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4751be0-208d-46d8-83c6-2fc35f3f9b24-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 18:22:02 crc kubenswrapper[4681]: I1007 18:22:02.925728 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mltk\" (UniqueName: \"kubernetes.io/projected/c4751be0-208d-46d8-83c6-2fc35f3f9b24-kube-api-access-6mltk\") on node \"crc\" DevicePath \"\"" Oct 07 18:22:03 crc kubenswrapper[4681]: I1007 18:22:03.182072 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-6zl9n_dd5794df-cde0-4881-921f-9ba7006d4281/control-plane-machine-set-operator/0.log" Oct 07 18:22:03 crc kubenswrapper[4681]: I1007 18:22:03.192560 4681 generic.go:334] "Generic (PLEG): container finished" podID="c4751be0-208d-46d8-83c6-2fc35f3f9b24" containerID="21f0cb9c764e26b4c5876824c9f87ba55d0abc3e5bc06287400e56bc1fcac346" exitCode=0 Oct 07 18:22:03 crc kubenswrapper[4681]: I1007 18:22:03.192607 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmcj7" event={"ID":"c4751be0-208d-46d8-83c6-2fc35f3f9b24","Type":"ContainerDied","Data":"21f0cb9c764e26b4c5876824c9f87ba55d0abc3e5bc06287400e56bc1fcac346"} Oct 07 18:22:03 crc kubenswrapper[4681]: I1007 18:22:03.192633 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmcj7" event={"ID":"c4751be0-208d-46d8-83c6-2fc35f3f9b24","Type":"ContainerDied","Data":"a91787c1b60d79c31a38809cc6cad7b07368e1eb0ad4ff041d8f6dedf8eaf039"} Oct 07 18:22:03 crc kubenswrapper[4681]: I1007 18:22:03.192651 4681 scope.go:117] "RemoveContainer" containerID="21f0cb9c764e26b4c5876824c9f87ba55d0abc3e5bc06287400e56bc1fcac346" Oct 07 18:22:03 crc kubenswrapper[4681]: I1007 18:22:03.192773 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qmcj7" Oct 07 18:22:03 crc kubenswrapper[4681]: I1007 18:22:03.216197 4681 scope.go:117] "RemoveContainer" containerID="4a020753b5ea0009a34fb792d038f821950595e3ce3b58aca034fd90b30d3b4c" Oct 07 18:22:03 crc kubenswrapper[4681]: I1007 18:22:03.220990 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qmcj7"] Oct 07 18:22:03 crc kubenswrapper[4681]: I1007 18:22:03.231698 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qmcj7"] Oct 07 18:22:03 crc kubenswrapper[4681]: I1007 18:22:03.241567 4681 scope.go:117] "RemoveContainer" containerID="2a82dc07dcb5176f71c12d66b920b9f6783708e2881a65bcf6268fd687132a66" Oct 07 18:22:03 crc kubenswrapper[4681]: I1007 18:22:03.286182 4681 scope.go:117] "RemoveContainer" containerID="21f0cb9c764e26b4c5876824c9f87ba55d0abc3e5bc06287400e56bc1fcac346" Oct 07 18:22:03 crc kubenswrapper[4681]: E1007 18:22:03.286693 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21f0cb9c764e26b4c5876824c9f87ba55d0abc3e5bc06287400e56bc1fcac346\": container with ID starting with 21f0cb9c764e26b4c5876824c9f87ba55d0abc3e5bc06287400e56bc1fcac346 not found: ID does not exist" containerID="21f0cb9c764e26b4c5876824c9f87ba55d0abc3e5bc06287400e56bc1fcac346" Oct 07 18:22:03 crc kubenswrapper[4681]: I1007 18:22:03.286758 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21f0cb9c764e26b4c5876824c9f87ba55d0abc3e5bc06287400e56bc1fcac346"} err="failed to get container status \"21f0cb9c764e26b4c5876824c9f87ba55d0abc3e5bc06287400e56bc1fcac346\": rpc error: code = NotFound desc = could not find container \"21f0cb9c764e26b4c5876824c9f87ba55d0abc3e5bc06287400e56bc1fcac346\": container with ID starting with 21f0cb9c764e26b4c5876824c9f87ba55d0abc3e5bc06287400e56bc1fcac346 not found: ID does not exist" Oct 07 18:22:03 crc kubenswrapper[4681]: I1007 18:22:03.286783 4681 scope.go:117] "RemoveContainer" containerID="4a020753b5ea0009a34fb792d038f821950595e3ce3b58aca034fd90b30d3b4c" Oct 07 18:22:03 crc kubenswrapper[4681]: E1007 18:22:03.287026 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a020753b5ea0009a34fb792d038f821950595e3ce3b58aca034fd90b30d3b4c\": container with ID starting with 4a020753b5ea0009a34fb792d038f821950595e3ce3b58aca034fd90b30d3b4c not found: ID does not exist" containerID="4a020753b5ea0009a34fb792d038f821950595e3ce3b58aca034fd90b30d3b4c" Oct 07 18:22:03 crc kubenswrapper[4681]: I1007 18:22:03.287049 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a020753b5ea0009a34fb792d038f821950595e3ce3b58aca034fd90b30d3b4c"} err="failed to get container status \"4a020753b5ea0009a34fb792d038f821950595e3ce3b58aca034fd90b30d3b4c\": rpc error: code = NotFound desc = could not find container \"4a020753b5ea0009a34fb792d038f821950595e3ce3b58aca034fd90b30d3b4c\": container with ID starting with 4a020753b5ea0009a34fb792d038f821950595e3ce3b58aca034fd90b30d3b4c not found: ID does not exist" Oct 07 18:22:03 crc kubenswrapper[4681]: I1007 18:22:03.287063 4681 scope.go:117] "RemoveContainer" containerID="2a82dc07dcb5176f71c12d66b920b9f6783708e2881a65bcf6268fd687132a66" Oct 07 18:22:03 crc kubenswrapper[4681]: E1007 18:22:03.292080 4681 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"2a82dc07dcb5176f71c12d66b920b9f6783708e2881a65bcf6268fd687132a66\": container with ID starting with 2a82dc07dcb5176f71c12d66b920b9f6783708e2881a65bcf6268fd687132a66 not found: ID does not exist" containerID="2a82dc07dcb5176f71c12d66b920b9f6783708e2881a65bcf6268fd687132a66" Oct 07 18:22:03 crc kubenswrapper[4681]: I1007 18:22:03.292185 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a82dc07dcb5176f71c12d66b920b9f6783708e2881a65bcf6268fd687132a66"} err="failed to get container status \"2a82dc07dcb5176f71c12d66b920b9f6783708e2881a65bcf6268fd687132a66\": rpc error: code = NotFound desc = could not find container \"2a82dc07dcb5176f71c12d66b920b9f6783708e2881a65bcf6268fd687132a66\": container with ID starting with 2a82dc07dcb5176f71c12d66b920b9f6783708e2881a65bcf6268fd687132a66 not found: ID does not exist" Oct 07 18:22:03 crc kubenswrapper[4681]: I1007 18:22:03.420317 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tg8wr_ba59400b-2ce1-489d-a70d-747f23b176c6/machine-api-operator/0.log" Oct 07 18:22:03 crc kubenswrapper[4681]: I1007 18:22:03.448176 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tg8wr_ba59400b-2ce1-489d-a70d-747f23b176c6/kube-rbac-proxy/0.log" Oct 07 18:22:05 crc kubenswrapper[4681]: I1007 18:22:05.041157 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4751be0-208d-46d8-83c6-2fc35f3f9b24" path="/var/lib/kubelet/pods/c4751be0-208d-46d8-83c6-2fc35f3f9b24/volumes" Oct 07 18:22:12 crc kubenswrapper[4681]: I1007 18:22:12.195782 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 18:22:12 crc kubenswrapper[4681]: I1007 18:22:12.196366 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 18:22:15 crc kubenswrapper[4681]: I1007 18:22:15.503365 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-wvq66_b603cb8d-41a5-4537-95da-d2e4fa39ce75/cert-manager-cainjector/0.log" Oct 07 18:22:15 crc kubenswrapper[4681]: I1007 18:22:15.507053 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-5qxkp_e132d85f-c498-41eb-a780-be92455331bb/cert-manager-controller/0.log" Oct 07 18:22:15 crc kubenswrapper[4681]: I1007 18:22:15.712057 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-djw5h_a8a15de5-2d99-41a6-b4c9-7d31c28413b2/cert-manager-webhook/0.log" Oct 07 18:22:29 crc kubenswrapper[4681]: I1007 18:22:29.627565 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-klc4h_d8e7f096-d849-4c9d-8338-2117a554f2de/nmstate-console-plugin/0.log" Oct 07 18:22:29 crc kubenswrapper[4681]: I1007 18:22:29.950445 4681 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-handler-lc2dx_e37df9f5-e512-4e43-9c96-c193553b43dd/nmstate-handler/0.log" Oct 07 18:22:30 crc kubenswrapper[4681]: I1007 18:22:30.013177 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-vkkc9_abc9d0ca-7b47-4f55-93ff-2f6cfa725fe7/nmstate-metrics/0.log" Oct 07 18:22:30 crc kubenswrapper[4681]: I1007 18:22:30.073833 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-vkkc9_abc9d0ca-7b47-4f55-93ff-2f6cfa725fe7/kube-rbac-proxy/0.log" Oct 07 18:22:30 crc kubenswrapper[4681]: I1007 18:22:30.263171 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-2xfw7_d77c5294-44ad-4618-abf8-143fb7872315/nmstate-operator/0.log" Oct 07 18:22:30 crc kubenswrapper[4681]: I1007 18:22:30.328915 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-7zfgs_64eeb4ec-129d-4fc8-be68-138e9c28cd3c/nmstate-webhook/0.log" Oct 07 18:22:42 crc kubenswrapper[4681]: I1007 18:22:42.195137 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 18:22:42 crc kubenswrapper[4681]: I1007 18:22:42.195633 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 18:22:46 crc kubenswrapper[4681]: I1007 18:22:46.302231 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-xszcp_ec498aeb-7c28-4e30-adee-e4546d01d498/controller/0.log" Oct 07 18:22:46 crc kubenswrapper[4681]: I1007 18:22:46.327913 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-xszcp_ec498aeb-7c28-4e30-adee-e4546d01d498/kube-rbac-proxy/0.log" Oct 07 18:22:46 crc kubenswrapper[4681]: I1007 18:22:46.921211 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/cp-frr-files/0.log" Oct 07 18:22:47 crc kubenswrapper[4681]: I1007 18:22:47.110134 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/cp-frr-files/0.log" Oct 07 18:22:47 crc kubenswrapper[4681]: I1007 18:22:47.142318 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/cp-reloader/0.log" Oct 07 18:22:47 crc kubenswrapper[4681]: I1007 18:22:47.159632 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/cp-reloader/0.log" Oct 07 18:22:47 crc kubenswrapper[4681]: I1007 18:22:47.188114 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/cp-metrics/0.log" Oct 07 18:22:47 crc kubenswrapper[4681]: I1007 18:22:47.407643 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/cp-frr-files/0.log" Oct 07 
18:22:47 crc kubenswrapper[4681]: I1007 18:22:47.450214 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/cp-metrics/0.log" Oct 07 18:22:47 crc kubenswrapper[4681]: I1007 18:22:47.458213 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/cp-metrics/0.log" Oct 07 18:22:47 crc kubenswrapper[4681]: I1007 18:22:47.469492 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/cp-reloader/0.log" Oct 07 18:22:47 crc kubenswrapper[4681]: I1007 18:22:47.657676 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/cp-reloader/0.log" Oct 07 18:22:47 crc kubenswrapper[4681]: I1007 18:22:47.705396 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/cp-metrics/0.log" Oct 07 18:22:47 crc kubenswrapper[4681]: I1007 18:22:47.756415 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/cp-frr-files/0.log" Oct 07 18:22:47 crc kubenswrapper[4681]: I1007 18:22:47.786916 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/controller/0.log" Oct 07 18:22:47 crc kubenswrapper[4681]: I1007 18:22:47.960321 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/frr-metrics/0.log" Oct 07 18:22:48 crc kubenswrapper[4681]: I1007 18:22:48.019714 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/kube-rbac-proxy/0.log" Oct 07 18:22:48 crc kubenswrapper[4681]: I1007 18:22:48.365053 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/kube-rbac-proxy-frr/0.log" Oct 07 18:22:48 crc kubenswrapper[4681]: I1007 18:22:48.419966 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/reloader/0.log" Oct 07 18:22:48 crc kubenswrapper[4681]: I1007 18:22:48.930156 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-zcsl2_0227af93-e3dc-47c9-b6ce-57d25fc998ea/frr-k8s-webhook-server/0.log" Oct 07 18:22:49 crc kubenswrapper[4681]: I1007 18:22:49.055265 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-598476574-wb9sj_abb48906-478a-4687-9e03-76d9035242b8/manager/0.log" Oct 07 18:22:49 crc kubenswrapper[4681]: I1007 18:22:49.213993 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5996f7f8c8-n6prk_350e1b80-5296-4a5b-a604-e9a42b56cbd1/webhook-server/0.log" Oct 07 18:22:49 crc kubenswrapper[4681]: I1007 18:22:49.215576 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/frr/0.log" Oct 07 18:22:49 crc kubenswrapper[4681]: I1007 18:22:49.497342 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-nzg7f_3f34c830-b1bc-433a-af20-0db4f0d96394/kube-rbac-proxy/0.log" Oct 07 18:22:49 crc kubenswrapper[4681]: I1007 
18:22:49.676473 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-nzg7f_3f34c830-b1bc-433a-af20-0db4f0d96394/speaker/0.log" Oct 07 18:23:01 crc kubenswrapper[4681]: I1007 18:23:01.901003 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj_9112c3b1-a90a-48d2-9282-cd9f4c055d39/util/0.log" Oct 07 18:23:02 crc kubenswrapper[4681]: I1007 18:23:02.058382 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj_9112c3b1-a90a-48d2-9282-cd9f4c055d39/pull/0.log" Oct 07 18:23:02 crc kubenswrapper[4681]: I1007 18:23:02.109630 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj_9112c3b1-a90a-48d2-9282-cd9f4c055d39/util/0.log" Oct 07 18:23:02 crc kubenswrapper[4681]: I1007 18:23:02.137871 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj_9112c3b1-a90a-48d2-9282-cd9f4c055d39/pull/0.log" Oct 07 18:23:02 crc kubenswrapper[4681]: I1007 18:23:02.286625 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj_9112c3b1-a90a-48d2-9282-cd9f4c055d39/util/0.log" Oct 07 18:23:02 crc kubenswrapper[4681]: I1007 18:23:02.312365 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj_9112c3b1-a90a-48d2-9282-cd9f4c055d39/pull/0.log" Oct 07 18:23:02 crc kubenswrapper[4681]: I1007 18:23:02.351687 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj_9112c3b1-a90a-48d2-9282-cd9f4c055d39/extract/0.log" Oct 07 18:23:02 crc kubenswrapper[4681]: I1007 18:23:02.496677 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dxw5l_7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d/extract-utilities/0.log" Oct 07 18:23:02 crc kubenswrapper[4681]: I1007 18:23:02.662309 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dxw5l_7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d/extract-utilities/0.log" Oct 07 18:23:02 crc kubenswrapper[4681]: I1007 18:23:02.733446 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dxw5l_7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d/extract-content/0.log" Oct 07 18:23:02 crc kubenswrapper[4681]: I1007 18:23:02.733623 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dxw5l_7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d/extract-content/0.log" Oct 07 18:23:02 crc kubenswrapper[4681]: I1007 18:23:02.904271 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dxw5l_7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d/extract-utilities/0.log" Oct 07 18:23:02 crc kubenswrapper[4681]: I1007 18:23:02.904378 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dxw5l_7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d/extract-content/0.log" Oct 07 18:23:03 crc kubenswrapper[4681]: I1007 18:23:03.089787 4681 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-dxw5l_7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d/registry-server/0.log" Oct 07 18:23:03 crc kubenswrapper[4681]: I1007 18:23:03.179401 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ntpwx_b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee/extract-utilities/0.log" Oct 07 18:23:03 crc kubenswrapper[4681]: I1007 18:23:03.345739 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ntpwx_b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee/extract-content/0.log" Oct 07 18:23:03 crc kubenswrapper[4681]: I1007 18:23:03.363478 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ntpwx_b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee/extract-content/0.log" Oct 07 18:23:03 crc kubenswrapper[4681]: I1007 18:23:03.377305 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ntpwx_b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee/extract-utilities/0.log" Oct 07 18:23:03 crc kubenswrapper[4681]: I1007 18:23:03.543958 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ntpwx_b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee/extract-content/0.log" Oct 07 18:23:03 crc kubenswrapper[4681]: I1007 18:23:03.605654 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ntpwx_b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee/extract-utilities/0.log" Oct 07 18:23:03 crc kubenswrapper[4681]: I1007 18:23:03.851000 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn_d7efbe4b-6c4e-4597-a08a-c65043f2466a/util/0.log" Oct 07 18:23:04 crc kubenswrapper[4681]: I1007 18:23:04.529153 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ntpwx_b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee/registry-server/0.log" Oct 07 18:23:04 crc kubenswrapper[4681]: I1007 18:23:04.606846 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn_d7efbe4b-6c4e-4597-a08a-c65043f2466a/util/0.log" Oct 07 18:23:04 crc kubenswrapper[4681]: I1007 18:23:04.666980 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn_d7efbe4b-6c4e-4597-a08a-c65043f2466a/pull/0.log" Oct 07 18:23:04 crc kubenswrapper[4681]: I1007 18:23:04.671566 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn_d7efbe4b-6c4e-4597-a08a-c65043f2466a/pull/0.log" Oct 07 18:23:04 crc kubenswrapper[4681]: I1007 18:23:04.789142 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn_d7efbe4b-6c4e-4597-a08a-c65043f2466a/util/0.log" Oct 07 18:23:04 crc kubenswrapper[4681]: I1007 18:23:04.871129 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn_d7efbe4b-6c4e-4597-a08a-c65043f2466a/extract/0.log" Oct 07 18:23:04 crc kubenswrapper[4681]: I1007 18:23:04.879961 4681 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn_d7efbe4b-6c4e-4597-a08a-c65043f2466a/pull/0.log" Oct 07 18:23:05 crc kubenswrapper[4681]: I1007 18:23:05.026822 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-kbm6c_4bf28f5b-4ee1-444b-ad73-7d63ecbd05c9/marketplace-operator/0.log" Oct 07 18:23:05 crc kubenswrapper[4681]: I1007 18:23:05.083620 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jvq9k_fb7e45fe-c863-485b-a67b-133a94f0a533/extract-utilities/0.log" Oct 07 18:23:05 crc kubenswrapper[4681]: I1007 18:23:05.314834 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jvq9k_fb7e45fe-c863-485b-a67b-133a94f0a533/extract-utilities/0.log" Oct 07 18:23:05 crc kubenswrapper[4681]: I1007 18:23:05.341082 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jvq9k_fb7e45fe-c863-485b-a67b-133a94f0a533/extract-content/0.log" Oct 07 18:23:05 crc kubenswrapper[4681]: I1007 18:23:05.352364 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jvq9k_fb7e45fe-c863-485b-a67b-133a94f0a533/extract-content/0.log" Oct 07 18:23:05 crc kubenswrapper[4681]: I1007 18:23:05.536042 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jvq9k_fb7e45fe-c863-485b-a67b-133a94f0a533/extract-content/0.log" Oct 07 18:23:05 crc kubenswrapper[4681]: I1007 18:23:05.551268 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jvq9k_fb7e45fe-c863-485b-a67b-133a94f0a533/extract-utilities/0.log" Oct 07 18:23:05 crc kubenswrapper[4681]: I1007 18:23:05.712503 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jvq9k_fb7e45fe-c863-485b-a67b-133a94f0a533/registry-server/0.log" Oct 07 18:23:06 crc kubenswrapper[4681]: I1007 18:23:06.051060 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2vsqz_9cc90449-f49f-4406-8af2-882d7e19b3f4/extract-utilities/0.log" Oct 07 18:23:06 crc kubenswrapper[4681]: I1007 18:23:06.218011 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2vsqz_9cc90449-f49f-4406-8af2-882d7e19b3f4/extract-utilities/0.log" Oct 07 18:23:06 crc kubenswrapper[4681]: I1007 18:23:06.259971 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2vsqz_9cc90449-f49f-4406-8af2-882d7e19b3f4/extract-content/0.log" Oct 07 18:23:06 crc kubenswrapper[4681]: I1007 18:23:06.271265 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2vsqz_9cc90449-f49f-4406-8af2-882d7e19b3f4/extract-content/0.log" Oct 07 18:23:06 crc kubenswrapper[4681]: I1007 18:23:06.411679 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2vsqz_9cc90449-f49f-4406-8af2-882d7e19b3f4/extract-utilities/0.log" Oct 07 18:23:06 crc kubenswrapper[4681]: I1007 18:23:06.466512 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2vsqz_9cc90449-f49f-4406-8af2-882d7e19b3f4/extract-content/0.log" Oct 07 18:23:06 crc kubenswrapper[4681]: I1007 18:23:06.857602 4681 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-2vsqz_9cc90449-f49f-4406-8af2-882d7e19b3f4/registry-server/0.log" Oct 07 18:23:12 crc kubenswrapper[4681]: I1007 18:23:12.195396 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 18:23:12 crc kubenswrapper[4681]: I1007 18:23:12.195891 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 18:23:12 crc kubenswrapper[4681]: I1007 18:23:12.195943 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" Oct 07 18:23:12 crc kubenswrapper[4681]: I1007 18:23:12.196668 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"643d746b61b3eb3b97412f52fbef0c2236e197f641a33ea27810ca402e822c27"} pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 18:23:12 crc kubenswrapper[4681]: I1007 18:23:12.196716 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" containerID="cri-o://643d746b61b3eb3b97412f52fbef0c2236e197f641a33ea27810ca402e822c27" gracePeriod=600 Oct 07 18:23:12 crc kubenswrapper[4681]: I1007 18:23:12.766254 4681 generic.go:334] "Generic (PLEG): container finished" podID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerID="643d746b61b3eb3b97412f52fbef0c2236e197f641a33ea27810ca402e822c27" exitCode=0 Oct 07 18:23:12 crc kubenswrapper[4681]: I1007 18:23:12.766309 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerDied","Data":"643d746b61b3eb3b97412f52fbef0c2236e197f641a33ea27810ca402e822c27"} Oct 07 18:23:12 crc kubenswrapper[4681]: I1007 18:23:12.766606 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerStarted","Data":"84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822"} Oct 07 18:23:12 crc kubenswrapper[4681]: I1007 18:23:12.766631 4681 scope.go:117] "RemoveContainer" containerID="04f129d65fafd7c372a17eba2498d95d7f67db21382a6e8024cfd2f3f6db5c24" Oct 07 18:23:41 crc kubenswrapper[4681]: E1007 18:23:41.966242 4681 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.129.56.93:50186->38.129.56.93:44823: read tcp 38.129.56.93:50186->38.129.56.93:44823: read: connection reset by peer Oct 07 18:23:41 crc kubenswrapper[4681]: E1007 18:23:41.966926 4681 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.93:50186->38.129.56.93:44823: write tcp 38.129.56.93:50186->38.129.56.93:44823: write: broken pipe Oct 07 18:25:12 crc 
Oct 07 18:25:12 crc kubenswrapper[4681]: I1007 18:25:12.194945 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 18:25:12 crc kubenswrapper[4681]: I1007 18:25:12.195469 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 18:25:29 crc kubenswrapper[4681]: I1007 18:25:29.352104 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x29c5"]
Oct 07 18:25:29 crc kubenswrapper[4681]: E1007 18:25:29.354283 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4751be0-208d-46d8-83c6-2fc35f3f9b24" containerName="extract-content"
Oct 07 18:25:29 crc kubenswrapper[4681]: I1007 18:25:29.354403 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4751be0-208d-46d8-83c6-2fc35f3f9b24" containerName="extract-content"
Oct 07 18:25:29 crc kubenswrapper[4681]: E1007 18:25:29.354499 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4751be0-208d-46d8-83c6-2fc35f3f9b24" containerName="extract-utilities"
Oct 07 18:25:29 crc kubenswrapper[4681]: I1007 18:25:29.354583 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4751be0-208d-46d8-83c6-2fc35f3f9b24" containerName="extract-utilities"
Oct 07 18:25:29 crc kubenswrapper[4681]: E1007 18:25:29.354664 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4751be0-208d-46d8-83c6-2fc35f3f9b24" containerName="registry-server"
Oct 07 18:25:29 crc kubenswrapper[4681]: I1007 18:25:29.354763 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4751be0-208d-46d8-83c6-2fc35f3f9b24" containerName="registry-server"
Oct 07 18:25:29 crc kubenswrapper[4681]: I1007 18:25:29.355141 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4751be0-208d-46d8-83c6-2fc35f3f9b24" containerName="registry-server"
Oct 07 18:25:29 crc kubenswrapper[4681]: I1007 18:25:29.356871 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x29c5"
Oct 07 18:25:29 crc kubenswrapper[4681]: I1007 18:25:29.364602 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x29c5"]
Oct 07 18:25:29 crc kubenswrapper[4681]: I1007 18:25:29.370697 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-722zp\" (UniqueName: \"kubernetes.io/projected/f307858b-654f-4800-977b-6f209f3bb3cf-kube-api-access-722zp\") pod \"redhat-marketplace-x29c5\" (UID: \"f307858b-654f-4800-977b-6f209f3bb3cf\") " pod="openshift-marketplace/redhat-marketplace-x29c5"
Oct 07 18:25:29 crc kubenswrapper[4681]: I1007 18:25:29.370934 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f307858b-654f-4800-977b-6f209f3bb3cf-catalog-content\") pod \"redhat-marketplace-x29c5\" (UID: \"f307858b-654f-4800-977b-6f209f3bb3cf\") " pod="openshift-marketplace/redhat-marketplace-x29c5"
Oct 07 18:25:29 crc kubenswrapper[4681]: I1007 18:25:29.371014 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f307858b-654f-4800-977b-6f209f3bb3cf-utilities\") pod \"redhat-marketplace-x29c5\" (UID: \"f307858b-654f-4800-977b-6f209f3bb3cf\") " pod="openshift-marketplace/redhat-marketplace-x29c5"
Oct 07 18:25:29 crc kubenswrapper[4681]: I1007 18:25:29.472394 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-722zp\" (UniqueName: \"kubernetes.io/projected/f307858b-654f-4800-977b-6f209f3bb3cf-kube-api-access-722zp\") pod \"redhat-marketplace-x29c5\" (UID: \"f307858b-654f-4800-977b-6f209f3bb3cf\") " pod="openshift-marketplace/redhat-marketplace-x29c5"
Oct 07 18:25:29 crc kubenswrapper[4681]: I1007 18:25:29.472470 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f307858b-654f-4800-977b-6f209f3bb3cf-catalog-content\") pod \"redhat-marketplace-x29c5\" (UID: \"f307858b-654f-4800-977b-6f209f3bb3cf\") " pod="openshift-marketplace/redhat-marketplace-x29c5"
Oct 07 18:25:29 crc kubenswrapper[4681]: I1007 18:25:29.472499 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f307858b-654f-4800-977b-6f209f3bb3cf-utilities\") pod \"redhat-marketplace-x29c5\" (UID: \"f307858b-654f-4800-977b-6f209f3bb3cf\") " pod="openshift-marketplace/redhat-marketplace-x29c5"
Oct 07 18:25:29 crc kubenswrapper[4681]: I1007 18:25:29.472998 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f307858b-654f-4800-977b-6f209f3bb3cf-utilities\") pod \"redhat-marketplace-x29c5\" (UID: \"f307858b-654f-4800-977b-6f209f3bb3cf\") " pod="openshift-marketplace/redhat-marketplace-x29c5"
Oct 07 18:25:29 crc kubenswrapper[4681]: I1007 18:25:29.473202 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f307858b-654f-4800-977b-6f209f3bb3cf-catalog-content\") pod \"redhat-marketplace-x29c5\" (UID: \"f307858b-654f-4800-977b-6f209f3bb3cf\") " pod="openshift-marketplace/redhat-marketplace-x29c5"
Oct 07 18:25:29 crc kubenswrapper[4681]: I1007 18:25:29.504611 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-722zp\" (UniqueName: \"kubernetes.io/projected/f307858b-654f-4800-977b-6f209f3bb3cf-kube-api-access-722zp\") pod \"redhat-marketplace-x29c5\" (UID: \"f307858b-654f-4800-977b-6f209f3bb3cf\") " pod="openshift-marketplace/redhat-marketplace-x29c5"
Oct 07 18:25:29 crc kubenswrapper[4681]: I1007 18:25:29.693265 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x29c5"
Oct 07 18:25:30 crc kubenswrapper[4681]: I1007 18:25:30.238372 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x29c5"]
Oct 07 18:25:31 crc kubenswrapper[4681]: I1007 18:25:31.051232 4681 generic.go:334] "Generic (PLEG): container finished" podID="f307858b-654f-4800-977b-6f209f3bb3cf" containerID="467b7ca8f432952bc2a9a12a5d096af218d4a1d63d3c448867ddb81add318b0f" exitCode=0
Oct 07 18:25:31 crc kubenswrapper[4681]: I1007 18:25:31.054172 4681 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 07 18:25:31 crc kubenswrapper[4681]: I1007 18:25:31.058475 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x29c5" event={"ID":"f307858b-654f-4800-977b-6f209f3bb3cf","Type":"ContainerDied","Data":"467b7ca8f432952bc2a9a12a5d096af218d4a1d63d3c448867ddb81add318b0f"}
Oct 07 18:25:31 crc kubenswrapper[4681]: I1007 18:25:31.058530 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x29c5" event={"ID":"f307858b-654f-4800-977b-6f209f3bb3cf","Type":"ContainerStarted","Data":"2e54f57b4d2db044070dad43e5bef4d8a0323b5b6f46c84ecda974b533ef116e"}
Oct 07 18:25:33 crc kubenswrapper[4681]: I1007 18:25:33.072857 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x29c5" event={"ID":"f307858b-654f-4800-977b-6f209f3bb3cf","Type":"ContainerStarted","Data":"6979f13ee6ad5d3f94826203b5bd04083267eb5c3e316f9fbac7cf434936d174"}
Oct 07 18:25:34 crc kubenswrapper[4681]: I1007 18:25:34.085368 4681 generic.go:334] "Generic (PLEG): container finished" podID="f307858b-654f-4800-977b-6f209f3bb3cf" containerID="6979f13ee6ad5d3f94826203b5bd04083267eb5c3e316f9fbac7cf434936d174" exitCode=0
Oct 07 18:25:34 crc kubenswrapper[4681]: I1007 18:25:34.085552 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x29c5" event={"ID":"f307858b-654f-4800-977b-6f209f3bb3cf","Type":"ContainerDied","Data":"6979f13ee6ad5d3f94826203b5bd04083267eb5c3e316f9fbac7cf434936d174"}
Oct 07 18:25:35 crc kubenswrapper[4681]: I1007 18:25:35.098780 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x29c5" event={"ID":"f307858b-654f-4800-977b-6f209f3bb3cf","Type":"ContainerStarted","Data":"59d7e69d577b47746e66d2f47854104204f48c30e89636219ca41db50c72bf0b"}
Oct 07 18:25:35 crc kubenswrapper[4681]: I1007 18:25:35.119693 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x29c5" podStartSLOduration=2.274542654 podStartE2EDuration="6.119676934s" podCreationTimestamp="2025-10-07 18:25:29 +0000 UTC" firstStartedPulling="2025-10-07 18:25:31.053860967 +0000 UTC m=+4934.701272522" lastFinishedPulling="2025-10-07 18:25:34.898995247 +0000 UTC m=+4938.546406802" observedRunningTime="2025-10-07 18:25:35.117662648 +0000 UTC m=+4938.765074203" watchObservedRunningTime="2025-10-07 18:25:35.119676934 +0000 UTC m=+4938.767088489"
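The pod_startup_latency_tracker entry above reports two durations for the same pod. The arithmetic is recoverable from the entry itself: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally excludes the image-pull window (lastFinishedPulling minus firstStartedPulling). A small sketch reproducing the reported values from the timestamps in the entry:

// Sketch: reproduces the startup-latency figures in the 18:25:35 entry.
// Timestamps are copied verbatim from that entry; the layout matches
// Go's time.Time.String() output, which is the format kubelet logs.
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2025-10-07 18:25:29 +0000 UTC")
	firstPull := parse("2025-10-07 18:25:31.053860967 +0000 UTC")
	lastPull := parse("2025-10-07 18:25:34.898995247 +0000 UTC")
	running := parse("2025-10-07 18:25:35.119676934 +0000 UTC")

	e2e := running.Sub(created)          // 6.119676934s (podStartE2EDuration)
	slo := e2e - lastPull.Sub(firstPull) // 2.274542654s (podStartSLOduration)
	fmt.Println(e2e, slo)
}

6.119676934s minus the 3.845134280s pull window is 2.274542654s, matching the logged podStartSLOduration.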
m=+4938.767088489" Oct 07 18:25:39 crc kubenswrapper[4681]: I1007 18:25:39.693984 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x29c5" Oct 07 18:25:39 crc kubenswrapper[4681]: I1007 18:25:39.694500 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x29c5" Oct 07 18:25:39 crc kubenswrapper[4681]: I1007 18:25:39.738337 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x29c5" Oct 07 18:25:40 crc kubenswrapper[4681]: I1007 18:25:40.190381 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x29c5" Oct 07 18:25:40 crc kubenswrapper[4681]: I1007 18:25:40.237163 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x29c5"] Oct 07 18:25:41 crc kubenswrapper[4681]: I1007 18:25:41.149267 4681 generic.go:334] "Generic (PLEG): container finished" podID="196867f6-ac14-4388-abf3-d184f19deffb" containerID="dea2bd95c4c4b5414bb4dd0788d593c86871fda10f49dad79a8f76c73c30ff08" exitCode=0 Oct 07 18:25:41 crc kubenswrapper[4681]: I1007 18:25:41.149359 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8xmpv/must-gather-p95jd" event={"ID":"196867f6-ac14-4388-abf3-d184f19deffb","Type":"ContainerDied","Data":"dea2bd95c4c4b5414bb4dd0788d593c86871fda10f49dad79a8f76c73c30ff08"} Oct 07 18:25:41 crc kubenswrapper[4681]: I1007 18:25:41.150416 4681 scope.go:117] "RemoveContainer" containerID="dea2bd95c4c4b5414bb4dd0788d593c86871fda10f49dad79a8f76c73c30ff08" Oct 07 18:25:41 crc kubenswrapper[4681]: I1007 18:25:41.975032 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8xmpv_must-gather-p95jd_196867f6-ac14-4388-abf3-d184f19deffb/gather/0.log" Oct 07 18:25:42 crc kubenswrapper[4681]: I1007 18:25:42.159633 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x29c5" podUID="f307858b-654f-4800-977b-6f209f3bb3cf" containerName="registry-server" containerID="cri-o://59d7e69d577b47746e66d2f47854104204f48c30e89636219ca41db50c72bf0b" gracePeriod=2 Oct 07 18:25:42 crc kubenswrapper[4681]: I1007 18:25:42.194732 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 18:25:42 crc kubenswrapper[4681]: I1007 18:25:42.194781 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 18:25:42 crc kubenswrapper[4681]: I1007 18:25:42.738018 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x29c5" Oct 07 18:25:42 crc kubenswrapper[4681]: I1007 18:25:42.820105 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f307858b-654f-4800-977b-6f209f3bb3cf-utilities\") pod \"f307858b-654f-4800-977b-6f209f3bb3cf\" (UID: \"f307858b-654f-4800-977b-6f209f3bb3cf\") " Oct 07 18:25:42 crc kubenswrapper[4681]: I1007 18:25:42.820858 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f307858b-654f-4800-977b-6f209f3bb3cf-catalog-content\") pod \"f307858b-654f-4800-977b-6f209f3bb3cf\" (UID: \"f307858b-654f-4800-977b-6f209f3bb3cf\") " Oct 07 18:25:42 crc kubenswrapper[4681]: I1007 18:25:42.821000 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-722zp\" (UniqueName: \"kubernetes.io/projected/f307858b-654f-4800-977b-6f209f3bb3cf-kube-api-access-722zp\") pod \"f307858b-654f-4800-977b-6f209f3bb3cf\" (UID: \"f307858b-654f-4800-977b-6f209f3bb3cf\") " Oct 07 18:25:42 crc kubenswrapper[4681]: I1007 18:25:42.821333 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f307858b-654f-4800-977b-6f209f3bb3cf-utilities" (OuterVolumeSpecName: "utilities") pod "f307858b-654f-4800-977b-6f209f3bb3cf" (UID: "f307858b-654f-4800-977b-6f209f3bb3cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 18:25:42 crc kubenswrapper[4681]: I1007 18:25:42.822040 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f307858b-654f-4800-977b-6f209f3bb3cf-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 18:25:42 crc kubenswrapper[4681]: I1007 18:25:42.835566 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f307858b-654f-4800-977b-6f209f3bb3cf-kube-api-access-722zp" (OuterVolumeSpecName: "kube-api-access-722zp") pod "f307858b-654f-4800-977b-6f209f3bb3cf" (UID: "f307858b-654f-4800-977b-6f209f3bb3cf"). InnerVolumeSpecName "kube-api-access-722zp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 18:25:42 crc kubenswrapper[4681]: I1007 18:25:42.849527 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f307858b-654f-4800-977b-6f209f3bb3cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f307858b-654f-4800-977b-6f209f3bb3cf" (UID: "f307858b-654f-4800-977b-6f209f3bb3cf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 18:25:42 crc kubenswrapper[4681]: I1007 18:25:42.925153 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f307858b-654f-4800-977b-6f209f3bb3cf-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 18:25:42 crc kubenswrapper[4681]: I1007 18:25:42.925187 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-722zp\" (UniqueName: \"kubernetes.io/projected/f307858b-654f-4800-977b-6f209f3bb3cf-kube-api-access-722zp\") on node \"crc\" DevicePath \"\"" Oct 07 18:25:43 crc kubenswrapper[4681]: I1007 18:25:43.169110 4681 generic.go:334] "Generic (PLEG): container finished" podID="f307858b-654f-4800-977b-6f209f3bb3cf" containerID="59d7e69d577b47746e66d2f47854104204f48c30e89636219ca41db50c72bf0b" exitCode=0 Oct 07 18:25:43 crc kubenswrapper[4681]: I1007 18:25:43.169171 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x29c5" event={"ID":"f307858b-654f-4800-977b-6f209f3bb3cf","Type":"ContainerDied","Data":"59d7e69d577b47746e66d2f47854104204f48c30e89636219ca41db50c72bf0b"} Oct 07 18:25:43 crc kubenswrapper[4681]: I1007 18:25:43.169201 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x29c5" event={"ID":"f307858b-654f-4800-977b-6f209f3bb3cf","Type":"ContainerDied","Data":"2e54f57b4d2db044070dad43e5bef4d8a0323b5b6f46c84ecda974b533ef116e"} Oct 07 18:25:43 crc kubenswrapper[4681]: I1007 18:25:43.169203 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x29c5" Oct 07 18:25:43 crc kubenswrapper[4681]: I1007 18:25:43.169223 4681 scope.go:117] "RemoveContainer" containerID="59d7e69d577b47746e66d2f47854104204f48c30e89636219ca41db50c72bf0b" Oct 07 18:25:43 crc kubenswrapper[4681]: I1007 18:25:43.195664 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x29c5"] Oct 07 18:25:43 crc kubenswrapper[4681]: I1007 18:25:43.197789 4681 scope.go:117] "RemoveContainer" containerID="6979f13ee6ad5d3f94826203b5bd04083267eb5c3e316f9fbac7cf434936d174" Oct 07 18:25:43 crc kubenswrapper[4681]: I1007 18:25:43.203757 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x29c5"] Oct 07 18:25:43 crc kubenswrapper[4681]: I1007 18:25:43.214954 4681 scope.go:117] "RemoveContainer" containerID="467b7ca8f432952bc2a9a12a5d096af218d4a1d63d3c448867ddb81add318b0f" Oct 07 18:25:43 crc kubenswrapper[4681]: I1007 18:25:43.257042 4681 scope.go:117] "RemoveContainer" containerID="59d7e69d577b47746e66d2f47854104204f48c30e89636219ca41db50c72bf0b" Oct 07 18:25:43 crc kubenswrapper[4681]: E1007 18:25:43.258266 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59d7e69d577b47746e66d2f47854104204f48c30e89636219ca41db50c72bf0b\": container with ID starting with 59d7e69d577b47746e66d2f47854104204f48c30e89636219ca41db50c72bf0b not found: ID does not exist" containerID="59d7e69d577b47746e66d2f47854104204f48c30e89636219ca41db50c72bf0b" Oct 07 18:25:43 crc kubenswrapper[4681]: I1007 18:25:43.258299 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59d7e69d577b47746e66d2f47854104204f48c30e89636219ca41db50c72bf0b"} err="failed to get container status 
\"59d7e69d577b47746e66d2f47854104204f48c30e89636219ca41db50c72bf0b\": rpc error: code = NotFound desc = could not find container \"59d7e69d577b47746e66d2f47854104204f48c30e89636219ca41db50c72bf0b\": container with ID starting with 59d7e69d577b47746e66d2f47854104204f48c30e89636219ca41db50c72bf0b not found: ID does not exist" Oct 07 18:25:43 crc kubenswrapper[4681]: I1007 18:25:43.258320 4681 scope.go:117] "RemoveContainer" containerID="6979f13ee6ad5d3f94826203b5bd04083267eb5c3e316f9fbac7cf434936d174" Oct 07 18:25:43 crc kubenswrapper[4681]: E1007 18:25:43.259052 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6979f13ee6ad5d3f94826203b5bd04083267eb5c3e316f9fbac7cf434936d174\": container with ID starting with 6979f13ee6ad5d3f94826203b5bd04083267eb5c3e316f9fbac7cf434936d174 not found: ID does not exist" containerID="6979f13ee6ad5d3f94826203b5bd04083267eb5c3e316f9fbac7cf434936d174" Oct 07 18:25:43 crc kubenswrapper[4681]: I1007 18:25:43.259094 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6979f13ee6ad5d3f94826203b5bd04083267eb5c3e316f9fbac7cf434936d174"} err="failed to get container status \"6979f13ee6ad5d3f94826203b5bd04083267eb5c3e316f9fbac7cf434936d174\": rpc error: code = NotFound desc = could not find container \"6979f13ee6ad5d3f94826203b5bd04083267eb5c3e316f9fbac7cf434936d174\": container with ID starting with 6979f13ee6ad5d3f94826203b5bd04083267eb5c3e316f9fbac7cf434936d174 not found: ID does not exist" Oct 07 18:25:43 crc kubenswrapper[4681]: I1007 18:25:43.259136 4681 scope.go:117] "RemoveContainer" containerID="467b7ca8f432952bc2a9a12a5d096af218d4a1d63d3c448867ddb81add318b0f" Oct 07 18:25:43 crc kubenswrapper[4681]: E1007 18:25:43.259535 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"467b7ca8f432952bc2a9a12a5d096af218d4a1d63d3c448867ddb81add318b0f\": container with ID starting with 467b7ca8f432952bc2a9a12a5d096af218d4a1d63d3c448867ddb81add318b0f not found: ID does not exist" containerID="467b7ca8f432952bc2a9a12a5d096af218d4a1d63d3c448867ddb81add318b0f" Oct 07 18:25:43 crc kubenswrapper[4681]: I1007 18:25:43.259574 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"467b7ca8f432952bc2a9a12a5d096af218d4a1d63d3c448867ddb81add318b0f"} err="failed to get container status \"467b7ca8f432952bc2a9a12a5d096af218d4a1d63d3c448867ddb81add318b0f\": rpc error: code = NotFound desc = could not find container \"467b7ca8f432952bc2a9a12a5d096af218d4a1d63d3c448867ddb81add318b0f\": container with ID starting with 467b7ca8f432952bc2a9a12a5d096af218d4a1d63d3c448867ddb81add318b0f not found: ID does not exist" Oct 07 18:25:45 crc kubenswrapper[4681]: I1007 18:25:45.043317 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f307858b-654f-4800-977b-6f209f3bb3cf" path="/var/lib/kubelet/pods/f307858b-654f-4800-977b-6f209f3bb3cf/volumes" Oct 07 18:25:51 crc kubenswrapper[4681]: I1007 18:25:51.626810 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8xmpv/must-gather-p95jd"] Oct 07 18:25:51 crc kubenswrapper[4681]: I1007 18:25:51.627630 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-8xmpv/must-gather-p95jd" podUID="196867f6-ac14-4388-abf3-d184f19deffb" containerName="copy" 
containerID="cri-o://fd399fca63690249c1665c4eae5625a1e4b549af77103b63557052bc2d6163bc" gracePeriod=2 Oct 07 18:25:51 crc kubenswrapper[4681]: I1007 18:25:51.638588 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8xmpv/must-gather-p95jd"] Oct 07 18:25:52 crc kubenswrapper[4681]: I1007 18:25:52.043040 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8xmpv_must-gather-p95jd_196867f6-ac14-4388-abf3-d184f19deffb/copy/0.log" Oct 07 18:25:52 crc kubenswrapper[4681]: I1007 18:25:52.043613 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8xmpv/must-gather-p95jd" Oct 07 18:25:52 crc kubenswrapper[4681]: I1007 18:25:52.115242 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/196867f6-ac14-4388-abf3-d184f19deffb-must-gather-output\") pod \"196867f6-ac14-4388-abf3-d184f19deffb\" (UID: \"196867f6-ac14-4388-abf3-d184f19deffb\") " Oct 07 18:25:52 crc kubenswrapper[4681]: I1007 18:25:52.115292 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bdxw\" (UniqueName: \"kubernetes.io/projected/196867f6-ac14-4388-abf3-d184f19deffb-kube-api-access-9bdxw\") pod \"196867f6-ac14-4388-abf3-d184f19deffb\" (UID: \"196867f6-ac14-4388-abf3-d184f19deffb\") " Oct 07 18:25:52 crc kubenswrapper[4681]: I1007 18:25:52.122767 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/196867f6-ac14-4388-abf3-d184f19deffb-kube-api-access-9bdxw" (OuterVolumeSpecName: "kube-api-access-9bdxw") pod "196867f6-ac14-4388-abf3-d184f19deffb" (UID: "196867f6-ac14-4388-abf3-d184f19deffb"). InnerVolumeSpecName "kube-api-access-9bdxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 18:25:52 crc kubenswrapper[4681]: I1007 18:25:52.218015 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bdxw\" (UniqueName: \"kubernetes.io/projected/196867f6-ac14-4388-abf3-d184f19deffb-kube-api-access-9bdxw\") on node \"crc\" DevicePath \"\"" Oct 07 18:25:52 crc kubenswrapper[4681]: I1007 18:25:52.254611 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8xmpv_must-gather-p95jd_196867f6-ac14-4388-abf3-d184f19deffb/copy/0.log" Oct 07 18:25:52 crc kubenswrapper[4681]: I1007 18:25:52.255014 4681 generic.go:334] "Generic (PLEG): container finished" podID="196867f6-ac14-4388-abf3-d184f19deffb" containerID="fd399fca63690249c1665c4eae5625a1e4b549af77103b63557052bc2d6163bc" exitCode=143 Oct 07 18:25:52 crc kubenswrapper[4681]: I1007 18:25:52.255068 4681 scope.go:117] "RemoveContainer" containerID="fd399fca63690249c1665c4eae5625a1e4b549af77103b63557052bc2d6163bc" Oct 07 18:25:52 crc kubenswrapper[4681]: I1007 18:25:52.255191 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8xmpv/must-gather-p95jd" Oct 07 18:25:52 crc kubenswrapper[4681]: I1007 18:25:52.277554 4681 scope.go:117] "RemoveContainer" containerID="dea2bd95c4c4b5414bb4dd0788d593c86871fda10f49dad79a8f76c73c30ff08" Oct 07 18:25:52 crc kubenswrapper[4681]: I1007 18:25:52.294034 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/196867f6-ac14-4388-abf3-d184f19deffb-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "196867f6-ac14-4388-abf3-d184f19deffb" (UID: "196867f6-ac14-4388-abf3-d184f19deffb"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 18:25:52 crc kubenswrapper[4681]: I1007 18:25:52.320188 4681 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/196867f6-ac14-4388-abf3-d184f19deffb-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 07 18:25:52 crc kubenswrapper[4681]: I1007 18:25:52.724714 4681 scope.go:117] "RemoveContainer" containerID="fd399fca63690249c1665c4eae5625a1e4b549af77103b63557052bc2d6163bc" Oct 07 18:25:52 crc kubenswrapper[4681]: E1007 18:25:52.731174 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd399fca63690249c1665c4eae5625a1e4b549af77103b63557052bc2d6163bc\": container with ID starting with fd399fca63690249c1665c4eae5625a1e4b549af77103b63557052bc2d6163bc not found: ID does not exist" containerID="fd399fca63690249c1665c4eae5625a1e4b549af77103b63557052bc2d6163bc" Oct 07 18:25:52 crc kubenswrapper[4681]: I1007 18:25:52.731231 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd399fca63690249c1665c4eae5625a1e4b549af77103b63557052bc2d6163bc"} err="failed to get container status \"fd399fca63690249c1665c4eae5625a1e4b549af77103b63557052bc2d6163bc\": rpc error: code = NotFound desc = could not find container \"fd399fca63690249c1665c4eae5625a1e4b549af77103b63557052bc2d6163bc\": container with ID starting with fd399fca63690249c1665c4eae5625a1e4b549af77103b63557052bc2d6163bc not found: ID does not exist" Oct 07 18:25:52 crc kubenswrapper[4681]: I1007 18:25:52.731264 4681 scope.go:117] "RemoveContainer" containerID="dea2bd95c4c4b5414bb4dd0788d593c86871fda10f49dad79a8f76c73c30ff08" Oct 07 18:25:52 crc kubenswrapper[4681]: E1007 18:25:52.731677 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dea2bd95c4c4b5414bb4dd0788d593c86871fda10f49dad79a8f76c73c30ff08\": container with ID starting with dea2bd95c4c4b5414bb4dd0788d593c86871fda10f49dad79a8f76c73c30ff08 not found: ID does not exist" containerID="dea2bd95c4c4b5414bb4dd0788d593c86871fda10f49dad79a8f76c73c30ff08" Oct 07 18:25:52 crc kubenswrapper[4681]: I1007 18:25:52.731702 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dea2bd95c4c4b5414bb4dd0788d593c86871fda10f49dad79a8f76c73c30ff08"} err="failed to get container status \"dea2bd95c4c4b5414bb4dd0788d593c86871fda10f49dad79a8f76c73c30ff08\": rpc error: code = NotFound desc = could not find container \"dea2bd95c4c4b5414bb4dd0788d593c86871fda10f49dad79a8f76c73c30ff08\": container with ID starting with dea2bd95c4c4b5414bb4dd0788d593c86871fda10f49dad79a8f76c73c30ff08 not found: ID does not exist" Oct 07 18:25:53 crc kubenswrapper[4681]: I1007 18:25:53.038976 4681 kubelet_volumes.go:163] 
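The RemoveContainer/DeleteContainer pairs above show the cleanup path behaving idempotently: the container is already gone, the CRI status lookup fails with gRPC NotFound, and kubelet logs the error but carries on. A sketch of that pattern (illustrative, not kubelet's actual code), assuming only the standard google.golang.org/grpc status API:

// Sketch: treat a gRPC NotFound from a runtime lookup as "already deleted".
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// alreadyGone reports whether err is the gRPC NotFound seen in the
// "rpc error: code = NotFound" log lines above.
func alreadyGone(err error) bool {
	if s, ok := status.FromError(err); ok {
		return s.Code() == codes.NotFound
	}
	return false
}

func main() {
	// Simulate the runtime's response for a container that no longer exists.
	err := status.Error(codes.NotFound, "could not find container")
	if alreadyGone(err) {
		fmt.Println("container already removed; treat delete as a no-op")
	}
}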
"Cleaned up orphaned pod volumes dir" podUID="196867f6-ac14-4388-abf3-d184f19deffb" path="/var/lib/kubelet/pods/196867f6-ac14-4388-abf3-d184f19deffb/volumes" Oct 07 18:26:12 crc kubenswrapper[4681]: I1007 18:26:12.195206 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 18:26:12 crc kubenswrapper[4681]: I1007 18:26:12.195733 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 18:26:12 crc kubenswrapper[4681]: I1007 18:26:12.195782 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" Oct 07 18:26:12 crc kubenswrapper[4681]: I1007 18:26:12.196626 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822"} pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 18:26:12 crc kubenswrapper[4681]: I1007 18:26:12.196680 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" containerID="cri-o://84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822" gracePeriod=600 Oct 07 18:26:12 crc kubenswrapper[4681]: E1007 18:26:12.328022 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:26:12 crc kubenswrapper[4681]: I1007 18:26:12.426795 4681 generic.go:334] "Generic (PLEG): container finished" podID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerID="84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822" exitCode=0 Oct 07 18:26:12 crc kubenswrapper[4681]: I1007 18:26:12.426864 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerDied","Data":"84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822"} Oct 07 18:26:12 crc kubenswrapper[4681]: I1007 18:26:12.426947 4681 scope.go:117] "RemoveContainer" containerID="643d746b61b3eb3b97412f52fbef0c2236e197f641a33ea27810ca402e822c27" Oct 07 18:26:12 crc kubenswrapper[4681]: I1007 18:26:12.428406 4681 scope.go:117] "RemoveContainer" containerID="84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822" Oct 07 18:26:12 crc kubenswrapper[4681]: E1007 18:26:12.429211 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:26:25 crc kubenswrapper[4681]: I1007 18:26:25.030134 4681 scope.go:117] "RemoveContainer" containerID="84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822" Oct 07 18:26:25 crc kubenswrapper[4681]: E1007 18:26:25.031129 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:26:32 crc kubenswrapper[4681]: I1007 18:26:32.137362 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jwxmq/must-gather-hgbpv"] Oct 07 18:26:32 crc kubenswrapper[4681]: E1007 18:26:32.138164 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f307858b-654f-4800-977b-6f209f3bb3cf" containerName="extract-content" Oct 07 18:26:32 crc kubenswrapper[4681]: I1007 18:26:32.138176 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="f307858b-654f-4800-977b-6f209f3bb3cf" containerName="extract-content" Oct 07 18:26:32 crc kubenswrapper[4681]: E1007 18:26:32.138197 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="196867f6-ac14-4388-abf3-d184f19deffb" containerName="copy" Oct 07 18:26:32 crc kubenswrapper[4681]: I1007 18:26:32.138203 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="196867f6-ac14-4388-abf3-d184f19deffb" containerName="copy" Oct 07 18:26:32 crc kubenswrapper[4681]: E1007 18:26:32.138225 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f307858b-654f-4800-977b-6f209f3bb3cf" containerName="registry-server" Oct 07 18:26:32 crc kubenswrapper[4681]: I1007 18:26:32.138231 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="f307858b-654f-4800-977b-6f209f3bb3cf" containerName="registry-server" Oct 07 18:26:32 crc kubenswrapper[4681]: E1007 18:26:32.138249 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f307858b-654f-4800-977b-6f209f3bb3cf" containerName="extract-utilities" Oct 07 18:26:32 crc kubenswrapper[4681]: I1007 18:26:32.138254 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="f307858b-654f-4800-977b-6f209f3bb3cf" containerName="extract-utilities" Oct 07 18:26:32 crc kubenswrapper[4681]: E1007 18:26:32.138272 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="196867f6-ac14-4388-abf3-d184f19deffb" containerName="gather" Oct 07 18:26:32 crc kubenswrapper[4681]: I1007 18:26:32.138277 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="196867f6-ac14-4388-abf3-d184f19deffb" containerName="gather" Oct 07 18:26:32 crc kubenswrapper[4681]: I1007 18:26:32.138451 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="196867f6-ac14-4388-abf3-d184f19deffb" containerName="gather" Oct 07 18:26:32 crc kubenswrapper[4681]: I1007 18:26:32.138460 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="196867f6-ac14-4388-abf3-d184f19deffb" containerName="copy" Oct 07 18:26:32 crc kubenswrapper[4681]: 
Oct 07 18:26:32 crc kubenswrapper[4681]: I1007 18:26:32.138486 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="f307858b-654f-4800-977b-6f209f3bb3cf" containerName="registry-server"
Oct 07 18:26:32 crc kubenswrapper[4681]: I1007 18:26:32.139564 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jwxmq/must-gather-hgbpv"
Oct 07 18:26:32 crc kubenswrapper[4681]: I1007 18:26:32.143722 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jwxmq"/"kube-root-ca.crt"
Oct 07 18:26:32 crc kubenswrapper[4681]: I1007 18:26:32.143972 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jwxmq"/"openshift-service-ca.crt"
Oct 07 18:26:32 crc kubenswrapper[4681]: I1007 18:26:32.177460 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jwxmq/must-gather-hgbpv"]
Oct 07 18:26:32 crc kubenswrapper[4681]: I1007 18:26:32.273041 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/67296e11-c9cb-4eb7-bcf7-26822676668b-must-gather-output\") pod \"must-gather-hgbpv\" (UID: \"67296e11-c9cb-4eb7-bcf7-26822676668b\") " pod="openshift-must-gather-jwxmq/must-gather-hgbpv"
Oct 07 18:26:32 crc kubenswrapper[4681]: I1007 18:26:32.273096 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcdjg\" (UniqueName: \"kubernetes.io/projected/67296e11-c9cb-4eb7-bcf7-26822676668b-kube-api-access-qcdjg\") pod \"must-gather-hgbpv\" (UID: \"67296e11-c9cb-4eb7-bcf7-26822676668b\") " pod="openshift-must-gather-jwxmq/must-gather-hgbpv"
Oct 07 18:26:32 crc kubenswrapper[4681]: I1007 18:26:32.374791 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/67296e11-c9cb-4eb7-bcf7-26822676668b-must-gather-output\") pod \"must-gather-hgbpv\" (UID: \"67296e11-c9cb-4eb7-bcf7-26822676668b\") " pod="openshift-must-gather-jwxmq/must-gather-hgbpv"
Oct 07 18:26:32 crc kubenswrapper[4681]: I1007 18:26:32.374856 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcdjg\" (UniqueName: \"kubernetes.io/projected/67296e11-c9cb-4eb7-bcf7-26822676668b-kube-api-access-qcdjg\") pod \"must-gather-hgbpv\" (UID: \"67296e11-c9cb-4eb7-bcf7-26822676668b\") " pod="openshift-must-gather-jwxmq/must-gather-hgbpv"
Oct 07 18:26:32 crc kubenswrapper[4681]: I1007 18:26:32.375473 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/67296e11-c9cb-4eb7-bcf7-26822676668b-must-gather-output\") pod \"must-gather-hgbpv\" (UID: \"67296e11-c9cb-4eb7-bcf7-26822676668b\") " pod="openshift-must-gather-jwxmq/must-gather-hgbpv"
Oct 07 18:26:32 crc kubenswrapper[4681]: I1007 18:26:32.403580 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcdjg\" (UniqueName: \"kubernetes.io/projected/67296e11-c9cb-4eb7-bcf7-26822676668b-kube-api-access-qcdjg\") pod \"must-gather-hgbpv\" (UID: \"67296e11-c9cb-4eb7-bcf7-26822676668b\") " pod="openshift-must-gather-jwxmq/must-gather-hgbpv"
Oct 07 18:26:32 crc kubenswrapper[4681]: I1007 18:26:32.457329 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jwxmq/must-gather-hgbpv"
Oct 07 18:26:32 crc kubenswrapper[4681]: I1007 18:26:32.971468 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jwxmq/must-gather-hgbpv"]
Oct 07 18:26:33 crc kubenswrapper[4681]: I1007 18:26:33.612833 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jwxmq/must-gather-hgbpv" event={"ID":"67296e11-c9cb-4eb7-bcf7-26822676668b","Type":"ContainerStarted","Data":"30fd71733d1c132f18cfb3e4dea15a34fc30918c31ee8392fc87040ee459cb53"}
Oct 07 18:26:33 crc kubenswrapper[4681]: I1007 18:26:33.613213 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jwxmq/must-gather-hgbpv" event={"ID":"67296e11-c9cb-4eb7-bcf7-26822676668b","Type":"ContainerStarted","Data":"a9b6007036794d95dff3894fdc629aaa2a7846d0d89a3c7bfe2ab6bf0ce5a983"}
Oct 07 18:26:33 crc kubenswrapper[4681]: I1007 18:26:33.613228 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jwxmq/must-gather-hgbpv" event={"ID":"67296e11-c9cb-4eb7-bcf7-26822676668b","Type":"ContainerStarted","Data":"02d5562119ff572f4d9978d26e9293bbf78769943a5134afa2c2af6e63853bea"}
Oct 07 18:26:33 crc kubenswrapper[4681]: I1007 18:26:33.635553 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jwxmq/must-gather-hgbpv" podStartSLOduration=1.635537678 podStartE2EDuration="1.635537678s" podCreationTimestamp="2025-10-07 18:26:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 18:26:33.632842283 +0000 UTC m=+4997.280253838" watchObservedRunningTime="2025-10-07 18:26:33.635537678 +0000 UTC m=+4997.282949223"
Oct 07 18:26:36 crc kubenswrapper[4681]: I1007 18:26:36.029789 4681 scope.go:117] "RemoveContainer" containerID="84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822"
Oct 07 18:26:36 crc kubenswrapper[4681]: E1007 18:26:36.030359 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea"
Oct 07 18:26:37 crc kubenswrapper[4681]: I1007 18:26:37.132154 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jwxmq/crc-debug-lhb2f"]
Oct 07 18:26:37 crc kubenswrapper[4681]: I1007 18:26:37.133590 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jwxmq/crc-debug-lhb2f"
Oct 07 18:26:37 crc kubenswrapper[4681]: I1007 18:26:37.137239 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-jwxmq"/"default-dockercfg-whdxj"
Oct 07 18:26:37 crc kubenswrapper[4681]: I1007 18:26:37.271042 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a06a640-3954-4a6e-8b4a-0d28ba62296a-host\") pod \"crc-debug-lhb2f\" (UID: \"8a06a640-3954-4a6e-8b4a-0d28ba62296a\") " pod="openshift-must-gather-jwxmq/crc-debug-lhb2f"
Oct 07 18:26:37 crc kubenswrapper[4681]: I1007 18:26:37.271267 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tnz9\" (UniqueName: \"kubernetes.io/projected/8a06a640-3954-4a6e-8b4a-0d28ba62296a-kube-api-access-7tnz9\") pod \"crc-debug-lhb2f\" (UID: \"8a06a640-3954-4a6e-8b4a-0d28ba62296a\") " pod="openshift-must-gather-jwxmq/crc-debug-lhb2f"
Oct 07 18:26:37 crc kubenswrapper[4681]: I1007 18:26:37.373040 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a06a640-3954-4a6e-8b4a-0d28ba62296a-host\") pod \"crc-debug-lhb2f\" (UID: \"8a06a640-3954-4a6e-8b4a-0d28ba62296a\") " pod="openshift-must-gather-jwxmq/crc-debug-lhb2f"
Oct 07 18:26:37 crc kubenswrapper[4681]: I1007 18:26:37.373124 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tnz9\" (UniqueName: \"kubernetes.io/projected/8a06a640-3954-4a6e-8b4a-0d28ba62296a-kube-api-access-7tnz9\") pod \"crc-debug-lhb2f\" (UID: \"8a06a640-3954-4a6e-8b4a-0d28ba62296a\") " pod="openshift-must-gather-jwxmq/crc-debug-lhb2f"
Oct 07 18:26:37 crc kubenswrapper[4681]: I1007 18:26:37.373180 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a06a640-3954-4a6e-8b4a-0d28ba62296a-host\") pod \"crc-debug-lhb2f\" (UID: \"8a06a640-3954-4a6e-8b4a-0d28ba62296a\") " pod="openshift-must-gather-jwxmq/crc-debug-lhb2f"
Oct 07 18:26:37 crc kubenswrapper[4681]: I1007 18:26:37.411170 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tnz9\" (UniqueName: \"kubernetes.io/projected/8a06a640-3954-4a6e-8b4a-0d28ba62296a-kube-api-access-7tnz9\") pod \"crc-debug-lhb2f\" (UID: \"8a06a640-3954-4a6e-8b4a-0d28ba62296a\") " pod="openshift-must-gather-jwxmq/crc-debug-lhb2f"
Oct 07 18:26:37 crc kubenswrapper[4681]: I1007 18:26:37.452257 4681 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-jwxmq/crc-debug-lhb2f" Oct 07 18:26:37 crc kubenswrapper[4681]: W1007 18:26:37.492145 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a06a640_3954_4a6e_8b4a_0d28ba62296a.slice/crio-6bce0d864c1d528738219283cfd36eea232e9bfee41122d896a2f7ac9c2f20a5 WatchSource:0}: Error finding container 6bce0d864c1d528738219283cfd36eea232e9bfee41122d896a2f7ac9c2f20a5: Status 404 returned error can't find the container with id 6bce0d864c1d528738219283cfd36eea232e9bfee41122d896a2f7ac9c2f20a5 Oct 07 18:26:37 crc kubenswrapper[4681]: I1007 18:26:37.646352 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jwxmq/crc-debug-lhb2f" event={"ID":"8a06a640-3954-4a6e-8b4a-0d28ba62296a","Type":"ContainerStarted","Data":"6bce0d864c1d528738219283cfd36eea232e9bfee41122d896a2f7ac9c2f20a5"} Oct 07 18:26:38 crc kubenswrapper[4681]: I1007 18:26:38.656773 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jwxmq/crc-debug-lhb2f" event={"ID":"8a06a640-3954-4a6e-8b4a-0d28ba62296a","Type":"ContainerStarted","Data":"257f67f8a991d37d096cd20dbcd01a708837d13ee11d1fae9c7016f45616e69b"} Oct 07 18:26:38 crc kubenswrapper[4681]: I1007 18:26:38.690128 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jwxmq/crc-debug-lhb2f" podStartSLOduration=1.690105113 podStartE2EDuration="1.690105113s" podCreationTimestamp="2025-10-07 18:26:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 18:26:38.683974282 +0000 UTC m=+5002.331385837" watchObservedRunningTime="2025-10-07 18:26:38.690105113 +0000 UTC m=+5002.337516668" Oct 07 18:26:50 crc kubenswrapper[4681]: I1007 18:26:50.029225 4681 scope.go:117] "RemoveContainer" containerID="84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822" Oct 07 18:26:50 crc kubenswrapper[4681]: E1007 18:26:50.029981 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:27:05 crc kubenswrapper[4681]: I1007 18:27:05.029195 4681 scope.go:117] "RemoveContainer" containerID="84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822" Oct 07 18:27:05 crc kubenswrapper[4681]: E1007 18:27:05.029954 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:27:18 crc kubenswrapper[4681]: I1007 18:27:18.029319 4681 scope.go:117] "RemoveContainer" containerID="84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822" Oct 07 18:27:18 crc kubenswrapper[4681]: E1007 18:27:18.030035 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:27:32 crc kubenswrapper[4681]: I1007 18:27:32.030479 4681 scope.go:117] "RemoveContainer" containerID="84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822" Oct 07 18:27:32 crc kubenswrapper[4681]: E1007 18:27:32.031307 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:27:37 crc kubenswrapper[4681]: I1007 18:27:37.576828 4681 scope.go:117] "RemoveContainer" containerID="e81f6d20256dff64f37afa30652e7fb518e875c5de3e690866aa62be9749171e" Oct 07 18:27:45 crc kubenswrapper[4681]: I1007 18:27:45.029996 4681 scope.go:117] "RemoveContainer" containerID="84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822" Oct 07 18:27:45 crc kubenswrapper[4681]: E1007 18:27:45.030705 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:28:00 crc kubenswrapper[4681]: I1007 18:28:00.034170 4681 scope.go:117] "RemoveContainer" containerID="84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822" Oct 07 18:28:00 crc kubenswrapper[4681]: E1007 18:28:00.034922 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:28:13 crc kubenswrapper[4681]: I1007 18:28:13.029518 4681 scope.go:117] "RemoveContainer" containerID="84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822" Oct 07 18:28:13 crc kubenswrapper[4681]: E1007 18:28:13.030180 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:28:14 crc kubenswrapper[4681]: I1007 18:28:14.217196 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7d8b9fbb46-6wjkq_07f40489-1614-45c8-864b-2288473c7c1d/barbican-api/0.log" Oct 07 18:28:14 crc kubenswrapper[4681]: I1007 18:28:14.222111 4681 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-7d8b9fbb46-6wjkq_07f40489-1614-45c8-864b-2288473c7c1d/barbican-api-log/0.log" Oct 07 18:28:14 crc kubenswrapper[4681]: I1007 18:28:14.439312 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d869d8764-5bjtz_f35c1eb1-692d-4484-a686-5ad0ce63744b/barbican-keystone-listener/0.log" Oct 07 18:28:14 crc kubenswrapper[4681]: I1007 18:28:14.506713 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d869d8764-5bjtz_f35c1eb1-692d-4484-a686-5ad0ce63744b/barbican-keystone-listener-log/0.log" Oct 07 18:28:14 crc kubenswrapper[4681]: I1007 18:28:14.659242 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-57b57fb795-6426k_62d6d4e2-d1d4-4967-82e9-143266e1165b/barbican-worker/0.log" Oct 07 18:28:14 crc kubenswrapper[4681]: I1007 18:28:14.717014 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-57b57fb795-6426k_62d6d4e2-d1d4-4967-82e9-143266e1165b/barbican-worker-log/0.log" Oct 07 18:28:14 crc kubenswrapper[4681]: I1007 18:28:14.941457 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-5jg6l_5da1ef34-103f-4687-8454-89abe7b61f54/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 18:28:15 crc kubenswrapper[4681]: I1007 18:28:15.223108 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c8863ad2-0fce-42cc-aae0-cd51fe7a79ab/ceilometer-notification-agent/0.log" Oct 07 18:28:15 crc kubenswrapper[4681]: I1007 18:28:15.228052 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c8863ad2-0fce-42cc-aae0-cd51fe7a79ab/ceilometer-central-agent/0.log" Oct 07 18:28:15 crc kubenswrapper[4681]: I1007 18:28:15.237159 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c8863ad2-0fce-42cc-aae0-cd51fe7a79ab/proxy-httpd/0.log" Oct 07 18:28:15 crc kubenswrapper[4681]: I1007 18:28:15.394984 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c8863ad2-0fce-42cc-aae0-cd51fe7a79ab/sg-core/0.log" Oct 07 18:28:15 crc kubenswrapper[4681]: I1007 18:28:15.498161 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7/cinder-api/0.log" Oct 07 18:28:15 crc kubenswrapper[4681]: I1007 18:28:15.655589 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_6d12fb3d-a5e9-450f-a6c5-abde4bb79bc7/cinder-api-log/0.log" Oct 07 18:28:15 crc kubenswrapper[4681]: I1007 18:28:15.823896 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9/cinder-scheduler/0.log" Oct 07 18:28:16 crc kubenswrapper[4681]: I1007 18:28:16.017351 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_541bd8e0-6cd1-4d9c-8b12-6c5f36e57db9/probe/0.log" Oct 07 18:28:16 crc kubenswrapper[4681]: I1007 18:28:16.192509 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-lqzjp_44ed5213-33ec-47cd-bc96-8d536fa86f61/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 18:28:16 crc kubenswrapper[4681]: I1007 18:28:16.334780 4681 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-d494h_fea47565-ef99-4b31-869a-075d2d8331e9/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 18:28:16 crc kubenswrapper[4681]: I1007 18:28:16.583718 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-jktwm_13708146-56fd-426d-988d-d6e66d01cadb/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 18:28:16 crc kubenswrapper[4681]: I1007 18:28:16.719265 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-67cb876dc9-zsjhj_a5b5bb10-eaaa-410b-8040-c9b15d4c0e62/init/0.log" Oct 07 18:28:17 crc kubenswrapper[4681]: I1007 18:28:17.021535 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-67cb876dc9-zsjhj_a5b5bb10-eaaa-410b-8040-c9b15d4c0e62/init/0.log" Oct 07 18:28:17 crc kubenswrapper[4681]: I1007 18:28:17.189796 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-67cb876dc9-zsjhj_a5b5bb10-eaaa-410b-8040-c9b15d4c0e62/dnsmasq-dns/0.log" Oct 07 18:28:17 crc kubenswrapper[4681]: I1007 18:28:17.342188 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-zp8s7_41bd87d5-77d6-4866-b9b8-aaed777393b5/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 18:28:17 crc kubenswrapper[4681]: I1007 18:28:17.473479 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_dbe731b8-1f1d-449c-accb-3cb97696d1ae/glance-httpd/0.log" Oct 07 18:28:17 crc kubenswrapper[4681]: I1007 18:28:17.636725 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_dbe731b8-1f1d-449c-accb-3cb97696d1ae/glance-log/0.log" Oct 07 18:28:17 crc kubenswrapper[4681]: I1007 18:28:17.749117 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_1469d2bd-93c0-414a-951e-175bc73f377e/glance-httpd/0.log" Oct 07 18:28:17 crc kubenswrapper[4681]: I1007 18:28:17.884908 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_1469d2bd-93c0-414a-951e-175bc73f377e/glance-log/0.log" Oct 07 18:28:18 crc kubenswrapper[4681]: I1007 18:28:18.079846 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-f945f854d-hm49c_02a91326-9285-4589-a05b-c0a2c2ed397e/horizon/2.log" Oct 07 18:28:18 crc kubenswrapper[4681]: I1007 18:28:18.341076 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-f945f854d-hm49c_02a91326-9285-4589-a05b-c0a2c2ed397e/horizon/1.log" Oct 07 18:28:18 crc kubenswrapper[4681]: I1007 18:28:18.463896 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-tb6hg_07a9584a-a546-4ec3-ba13-1f0db8c3ba39/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 18:28:18 crc kubenswrapper[4681]: I1007 18:28:18.614837 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-9qxmf_51176e78-6a59-4fe2-abc5-88a3177b9ee0/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 18:28:18 crc kubenswrapper[4681]: I1007 18:28:18.670784 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-f945f854d-hm49c_02a91326-9285-4589-a05b-c0a2c2ed397e/horizon-log/0.log" Oct 07 18:28:18 crc kubenswrapper[4681]: I1007 
18:28:18.932305 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29331001-dj57c_0205aec3-2b1b-427b-9359-40d4118c7f59/keystone-cron/0.log" Oct 07 18:28:19 crc kubenswrapper[4681]: I1007 18:28:19.204457 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_92e5095e-22e9-46b1-900a-492f827a05eb/kube-state-metrics/0.log" Oct 07 18:28:19 crc kubenswrapper[4681]: I1007 18:28:19.487906 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7db8ffcf86-wnnfn_ea2f52b3-77b8-44c3-b6ca-ce1e70d93dbd/keystone-api/0.log" Oct 07 18:28:19 crc kubenswrapper[4681]: I1007 18:28:19.562956 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-djlbc_3c08afe2-1291-4ac9-8eb5-493f9cff1c4d/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 18:28:20 crc kubenswrapper[4681]: I1007 18:28:20.379963 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5b94d78545-dfdgb_c77522e8-d403-4227-9740-21dca2843c58/neutron-httpd/0.log" Oct 07 18:28:20 crc kubenswrapper[4681]: I1007 18:28:20.488678 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5b94d78545-dfdgb_c77522e8-d403-4227-9740-21dca2843c58/neutron-api/0.log" Oct 07 18:28:20 crc kubenswrapper[4681]: I1007 18:28:20.641174 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-zdgdw_d1f9c32e-011c-49a9-8319-4aeb852fa976/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 18:28:22 crc kubenswrapper[4681]: I1007 18:28:22.006978 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_ef096ee9-933c-44da-a4b7-6cc5b62ecc49/nova-cell0-conductor-conductor/0.log" Oct 07 18:28:22 crc kubenswrapper[4681]: I1007 18:28:22.321221 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9241da9a-f1bd-4d93-bd72-f84e5dd85083/nova-api-log/0.log" Oct 07 18:28:22 crc kubenswrapper[4681]: I1007 18:28:22.941775 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a1765198-be66-424a-b57a-187a6b62c4bc/nova-cell1-conductor-conductor/0.log" Oct 07 18:28:23 crc kubenswrapper[4681]: I1007 18:28:23.018434 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9241da9a-f1bd-4d93-bd72-f84e5dd85083/nova-api-api/0.log" Oct 07 18:28:23 crc kubenswrapper[4681]: I1007 18:28:23.222041 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_48284d8c-6f51-4fa0-ae29-b933b93a2411/nova-cell1-novncproxy-novncproxy/0.log" Oct 07 18:28:23 crc kubenswrapper[4681]: I1007 18:28:23.523761 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-rk6x9_a7d237e9-d752-4244-8f32-be01a5ca3f6f/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 18:28:23 crc kubenswrapper[4681]: I1007 18:28:23.637381 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5e80aacf-4a39-48b9-96c3-692936cf2855/nova-metadata-log/0.log" Oct 07 18:28:24 crc kubenswrapper[4681]: I1007 18:28:24.627191 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_61391679-2b8c-4be3-b3d7-bd2d3e667c15/mysql-bootstrap/0.log" Oct 07 18:28:24 crc kubenswrapper[4681]: I1007 18:28:24.684328 4681 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_51290795-4e81-4099-ab84-e9529128d78a/nova-scheduler-scheduler/0.log" Oct 07 18:28:24 crc kubenswrapper[4681]: I1007 18:28:24.955504 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_61391679-2b8c-4be3-b3d7-bd2d3e667c15/mysql-bootstrap/0.log" Oct 07 18:28:25 crc kubenswrapper[4681]: I1007 18:28:25.041510 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_61391679-2b8c-4be3-b3d7-bd2d3e667c15/galera/0.log" Oct 07 18:28:25 crc kubenswrapper[4681]: I1007 18:28:25.723105 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7d261af7-bc67-4638-8b4c-1f7a7cb129a2/mysql-bootstrap/0.log" Oct 07 18:28:26 crc kubenswrapper[4681]: I1007 18:28:26.029678 4681 scope.go:117] "RemoveContainer" containerID="84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822" Oct 07 18:28:26 crc kubenswrapper[4681]: E1007 18:28:26.030160 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:28:26 crc kubenswrapper[4681]: I1007 18:28:26.094239 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7d261af7-bc67-4638-8b4c-1f7a7cb129a2/galera/0.log" Oct 07 18:28:26 crc kubenswrapper[4681]: I1007 18:28:26.099635 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7d261af7-bc67-4638-8b4c-1f7a7cb129a2/mysql-bootstrap/0.log" Oct 07 18:28:26 crc kubenswrapper[4681]: I1007 18:28:26.321454 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5e80aacf-4a39-48b9-96c3-692936cf2855/nova-metadata-metadata/0.log" Oct 07 18:28:26 crc kubenswrapper[4681]: I1007 18:28:26.416379 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a253ef31-4d02-4fbd-8842-cf2fbe41f307/openstackclient/0.log" Oct 07 18:28:26 crc kubenswrapper[4681]: I1007 18:28:26.687682 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-v4f4x_361da154-8a78-497d-9bb1-78335f5a286d/openstack-network-exporter/0.log" Oct 07 18:28:27 crc kubenswrapper[4681]: I1007 18:28:27.024271 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6tf88_6a172508-6850-4bf5-8e7f-6c6674c4a1ee/ovsdb-server-init/0.log" Oct 07 18:28:27 crc kubenswrapper[4681]: I1007 18:28:27.308297 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6tf88_6a172508-6850-4bf5-8e7f-6c6674c4a1ee/ovsdb-server/0.log" Oct 07 18:28:27 crc kubenswrapper[4681]: I1007 18:28:27.340966 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6tf88_6a172508-6850-4bf5-8e7f-6c6674c4a1ee/ovs-vswitchd/0.log" Oct 07 18:28:27 crc kubenswrapper[4681]: I1007 18:28:27.375754 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6tf88_6a172508-6850-4bf5-8e7f-6c6674c4a1ee/ovsdb-server-init/0.log" Oct 07 18:28:27 crc kubenswrapper[4681]: I1007 18:28:27.651152 4681 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-xhwkc_8be45f14-7feb-40fa-a0a8-919c6d8cd052/ovn-controller/0.log" Oct 07 18:28:27 crc kubenswrapper[4681]: I1007 18:28:27.935470 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-wd6mz_eb92dd00-8b97-470f-9f2c-3ff1ee783f93/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 18:28:28 crc kubenswrapper[4681]: I1007 18:28:28.036271 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1209f82a-cbcc-4833-98f0-6e2a07b53aeb/openstack-network-exporter/0.log" Oct 07 18:28:28 crc kubenswrapper[4681]: I1007 18:28:28.313625 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1209f82a-cbcc-4833-98f0-6e2a07b53aeb/ovn-northd/0.log" Oct 07 18:28:28 crc kubenswrapper[4681]: I1007 18:28:28.376587 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9de0f04a-f2ed-48ee-a873-8a02b70fb146/openstack-network-exporter/0.log" Oct 07 18:28:28 crc kubenswrapper[4681]: I1007 18:28:28.705976 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9de0f04a-f2ed-48ee-a873-8a02b70fb146/ovsdbserver-nb/0.log" Oct 07 18:28:28 crc kubenswrapper[4681]: I1007 18:28:28.789999 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d5d2debf-c5bb-47fa-9d33-69c2f549a3e0/openstack-network-exporter/0.log" Oct 07 18:28:29 crc kubenswrapper[4681]: I1007 18:28:29.077649 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d5d2debf-c5bb-47fa-9d33-69c2f549a3e0/ovsdbserver-sb/0.log" Oct 07 18:28:29 crc kubenswrapper[4681]: I1007 18:28:29.297581 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-68578dd4f6-bzx29_0d112a4c-ca20-4593-ac26-4e88a56ca00a/placement-api/0.log" Oct 07 18:28:29 crc kubenswrapper[4681]: I1007 18:28:29.650418 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4222be9f-615b-431f-9285-c629a68426e0/setup-container/0.log" Oct 07 18:28:29 crc kubenswrapper[4681]: I1007 18:28:29.655551 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-68578dd4f6-bzx29_0d112a4c-ca20-4593-ac26-4e88a56ca00a/placement-log/0.log" Oct 07 18:28:30 crc kubenswrapper[4681]: I1007 18:28:30.062085 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4222be9f-615b-431f-9285-c629a68426e0/setup-container/0.log" Oct 07 18:28:30 crc kubenswrapper[4681]: I1007 18:28:30.106568 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4222be9f-615b-431f-9285-c629a68426e0/rabbitmq/0.log" Oct 07 18:28:30 crc kubenswrapper[4681]: I1007 18:28:30.377115 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6b4aa12d-0e45-47d7-b279-e705aef9c323/setup-container/0.log" Oct 07 18:28:30 crc kubenswrapper[4681]: I1007 18:28:30.589040 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6b4aa12d-0e45-47d7-b279-e705aef9c323/setup-container/0.log" Oct 07 18:28:30 crc kubenswrapper[4681]: I1007 18:28:30.687667 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6b4aa12d-0e45-47d7-b279-e705aef9c323/rabbitmq/0.log" Oct 07 18:28:30 crc kubenswrapper[4681]: I1007 18:28:30.883343 4681 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-7j5zl_12f3d295-f471-4e4d-9884-3bf34dab377f/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 18:28:31 crc kubenswrapper[4681]: I1007 18:28:31.158858 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-hmgls_0cdf5aef-8c9a-4cd5-8f38-2f368fe245df/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 18:28:31 crc kubenswrapper[4681]: I1007 18:28:31.584685 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-jkgjj_a1d74e17-5142-40f0-9847-0f9ee5e33f90/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 18:28:31 crc kubenswrapper[4681]: I1007 18:28:31.710335 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-rp758_da2c6bbc-c4e1-4767-8815-fbc4cada002a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 18:28:31 crc kubenswrapper[4681]: I1007 18:28:31.984267 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-qlsmk_4022d381-ce12-4c86-9368-4089026a66d3/ssh-known-hosts-edpm-deployment/0.log" Oct 07 18:28:32 crc kubenswrapper[4681]: I1007 18:28:32.643349 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-58b7954b47-8j9j9_642b1a07-3c90-40b5-b6cb-af1d8832649b/proxy-server/0.log" Oct 07 18:28:32 crc kubenswrapper[4681]: I1007 18:28:32.774305 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-58b7954b47-8j9j9_642b1a07-3c90-40b5-b6cb-af1d8832649b/proxy-httpd/0.log" Oct 07 18:28:32 crc kubenswrapper[4681]: I1007 18:28:32.995784 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-npftw_51ee2d02-f1ea-4e04-817a-c08925a2078d/swift-ring-rebalance/0.log" Oct 07 18:28:33 crc kubenswrapper[4681]: I1007 18:28:33.268947 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/account-auditor/0.log" Oct 07 18:28:33 crc kubenswrapper[4681]: I1007 18:28:33.312485 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/account-reaper/0.log" Oct 07 18:28:33 crc kubenswrapper[4681]: I1007 18:28:33.440635 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/account-replicator/0.log" Oct 07 18:28:33 crc kubenswrapper[4681]: I1007 18:28:33.558697 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/account-server/0.log" Oct 07 18:28:33 crc kubenswrapper[4681]: I1007 18:28:33.638913 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/container-auditor/0.log" Oct 07 18:28:33 crc kubenswrapper[4681]: I1007 18:28:33.817728 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/container-replicator/0.log" Oct 07 18:28:33 crc kubenswrapper[4681]: I1007 18:28:33.945412 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/container-server/0.log" Oct 07 18:28:33 crc kubenswrapper[4681]: I1007 18:28:33.955281 4681 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/container-updater/0.log" Oct 07 18:28:34 crc kubenswrapper[4681]: I1007 18:28:34.239792 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/object-auditor/0.log" Oct 07 18:28:34 crc kubenswrapper[4681]: I1007 18:28:34.295011 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/object-replicator/0.log" Oct 07 18:28:34 crc kubenswrapper[4681]: I1007 18:28:34.326985 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/object-expirer/0.log" Oct 07 18:28:34 crc kubenswrapper[4681]: I1007 18:28:34.582464 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/object-server/0.log" Oct 07 18:28:34 crc kubenswrapper[4681]: I1007 18:28:34.606484 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/object-updater/0.log" Oct 07 18:28:34 crc kubenswrapper[4681]: I1007 18:28:34.730825 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/rsync/0.log" Oct 07 18:28:34 crc kubenswrapper[4681]: I1007 18:28:34.786648 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e111df37-d4f7-4dc5-ad9a-04b05519309a/swift-recon-cron/0.log" Oct 07 18:28:35 crc kubenswrapper[4681]: I1007 18:28:35.193955 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-rbskw_650f08d2-bbd6-4cf7-b8d1-5923a4075672/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 18:28:35 crc kubenswrapper[4681]: I1007 18:28:35.332211 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_01a2ae55-90f7-432a-bc03-aedd6db91210/tempest-tests-tempest-tests-runner/0.log" Oct 07 18:28:35 crc kubenswrapper[4681]: I1007 18:28:35.550858 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_27c574e9-b637-4326-853e-f298321f1a1b/test-operator-logs-container/0.log" Oct 07 18:28:35 crc kubenswrapper[4681]: I1007 18:28:35.833243 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-49wz2_d3ee9809-6f86-44fb-9b11-163437e7750e/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 18:28:40 crc kubenswrapper[4681]: I1007 18:28:40.033114 4681 scope.go:117] "RemoveContainer" containerID="84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822" Oct 07 18:28:40 crc kubenswrapper[4681]: E1007 18:28:40.033607 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:28:44 crc kubenswrapper[4681]: I1007 18:28:44.665519 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_1aa4b182-3cf3-4e5d-b59d-5e00004cb912/memcached/0.log" Oct 07 18:28:51 crc 
kubenswrapper[4681]: I1007 18:28:51.029349 4681 scope.go:117] "RemoveContainer" containerID="84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822" Oct 07 18:28:51 crc kubenswrapper[4681]: E1007 18:28:51.030061 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:28:55 crc kubenswrapper[4681]: I1007 18:28:55.921805 4681 generic.go:334] "Generic (PLEG): container finished" podID="8a06a640-3954-4a6e-8b4a-0d28ba62296a" containerID="257f67f8a991d37d096cd20dbcd01a708837d13ee11d1fae9c7016f45616e69b" exitCode=0 Oct 07 18:28:55 crc kubenswrapper[4681]: I1007 18:28:55.922299 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jwxmq/crc-debug-lhb2f" event={"ID":"8a06a640-3954-4a6e-8b4a-0d28ba62296a","Type":"ContainerDied","Data":"257f67f8a991d37d096cd20dbcd01a708837d13ee11d1fae9c7016f45616e69b"} Oct 07 18:28:57 crc kubenswrapper[4681]: I1007 18:28:57.026896 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jwxmq/crc-debug-lhb2f" Oct 07 18:28:57 crc kubenswrapper[4681]: I1007 18:28:57.066549 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jwxmq/crc-debug-lhb2f"] Oct 07 18:28:57 crc kubenswrapper[4681]: I1007 18:28:57.073339 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jwxmq/crc-debug-lhb2f"] Oct 07 18:28:57 crc kubenswrapper[4681]: I1007 18:28:57.112932 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tnz9\" (UniqueName: \"kubernetes.io/projected/8a06a640-3954-4a6e-8b4a-0d28ba62296a-kube-api-access-7tnz9\") pod \"8a06a640-3954-4a6e-8b4a-0d28ba62296a\" (UID: \"8a06a640-3954-4a6e-8b4a-0d28ba62296a\") " Oct 07 18:28:57 crc kubenswrapper[4681]: I1007 18:28:57.113144 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a06a640-3954-4a6e-8b4a-0d28ba62296a-host\") pod \"8a06a640-3954-4a6e-8b4a-0d28ba62296a\" (UID: \"8a06a640-3954-4a6e-8b4a-0d28ba62296a\") " Oct 07 18:28:57 crc kubenswrapper[4681]: I1007 18:28:57.113337 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a06a640-3954-4a6e-8b4a-0d28ba62296a-host" (OuterVolumeSpecName: "host") pod "8a06a640-3954-4a6e-8b4a-0d28ba62296a" (UID: "8a06a640-3954-4a6e-8b4a-0d28ba62296a"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 18:28:57 crc kubenswrapper[4681]: I1007 18:28:57.113601 4681 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a06a640-3954-4a6e-8b4a-0d28ba62296a-host\") on node \"crc\" DevicePath \"\"" Oct 07 18:28:57 crc kubenswrapper[4681]: I1007 18:28:57.119376 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a06a640-3954-4a6e-8b4a-0d28ba62296a-kube-api-access-7tnz9" (OuterVolumeSpecName: "kube-api-access-7tnz9") pod "8a06a640-3954-4a6e-8b4a-0d28ba62296a" (UID: "8a06a640-3954-4a6e-8b4a-0d28ba62296a"). InnerVolumeSpecName "kube-api-access-7tnz9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 18:28:57 crc kubenswrapper[4681]: I1007 18:28:57.215704 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tnz9\" (UniqueName: \"kubernetes.io/projected/8a06a640-3954-4a6e-8b4a-0d28ba62296a-kube-api-access-7tnz9\") on node \"crc\" DevicePath \"\"" Oct 07 18:28:57 crc kubenswrapper[4681]: I1007 18:28:57.942363 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bce0d864c1d528738219283cfd36eea232e9bfee41122d896a2f7ac9c2f20a5" Oct 07 18:28:57 crc kubenswrapper[4681]: I1007 18:28:57.942419 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jwxmq/crc-debug-lhb2f" Oct 07 18:28:58 crc kubenswrapper[4681]: I1007 18:28:58.275344 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jwxmq/crc-debug-dj7ln"] Oct 07 18:28:58 crc kubenswrapper[4681]: E1007 18:28:58.275743 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a06a640-3954-4a6e-8b4a-0d28ba62296a" containerName="container-00" Oct 07 18:28:58 crc kubenswrapper[4681]: I1007 18:28:58.275755 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a06a640-3954-4a6e-8b4a-0d28ba62296a" containerName="container-00" Oct 07 18:28:58 crc kubenswrapper[4681]: I1007 18:28:58.275959 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a06a640-3954-4a6e-8b4a-0d28ba62296a" containerName="container-00" Oct 07 18:28:58 crc kubenswrapper[4681]: I1007 18:28:58.276520 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jwxmq/crc-debug-dj7ln" Oct 07 18:28:58 crc kubenswrapper[4681]: I1007 18:28:58.279001 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-jwxmq"/"default-dockercfg-whdxj" Oct 07 18:28:58 crc kubenswrapper[4681]: I1007 18:28:58.338093 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j7f7\" (UniqueName: \"kubernetes.io/projected/af69c13e-ab90-4eec-bf0d-28ae68a3d16d-kube-api-access-5j7f7\") pod \"crc-debug-dj7ln\" (UID: \"af69c13e-ab90-4eec-bf0d-28ae68a3d16d\") " pod="openshift-must-gather-jwxmq/crc-debug-dj7ln" Oct 07 18:28:58 crc kubenswrapper[4681]: I1007 18:28:58.338405 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af69c13e-ab90-4eec-bf0d-28ae68a3d16d-host\") pod \"crc-debug-dj7ln\" (UID: \"af69c13e-ab90-4eec-bf0d-28ae68a3d16d\") " pod="openshift-must-gather-jwxmq/crc-debug-dj7ln" Oct 07 18:28:58 crc kubenswrapper[4681]: I1007 18:28:58.440152 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af69c13e-ab90-4eec-bf0d-28ae68a3d16d-host\") pod \"crc-debug-dj7ln\" (UID: \"af69c13e-ab90-4eec-bf0d-28ae68a3d16d\") " pod="openshift-must-gather-jwxmq/crc-debug-dj7ln" Oct 07 18:28:58 crc kubenswrapper[4681]: I1007 18:28:58.440331 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af69c13e-ab90-4eec-bf0d-28ae68a3d16d-host\") pod \"crc-debug-dj7ln\" (UID: \"af69c13e-ab90-4eec-bf0d-28ae68a3d16d\") " pod="openshift-must-gather-jwxmq/crc-debug-dj7ln" Oct 07 18:28:58 crc kubenswrapper[4681]: I1007 18:28:58.440498 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5j7f7\" (UniqueName: \"kubernetes.io/projected/af69c13e-ab90-4eec-bf0d-28ae68a3d16d-kube-api-access-5j7f7\") pod \"crc-debug-dj7ln\" (UID: \"af69c13e-ab90-4eec-bf0d-28ae68a3d16d\") " pod="openshift-must-gather-jwxmq/crc-debug-dj7ln" Oct 07 18:28:58 crc kubenswrapper[4681]: I1007 18:28:58.461551 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j7f7\" (UniqueName: \"kubernetes.io/projected/af69c13e-ab90-4eec-bf0d-28ae68a3d16d-kube-api-access-5j7f7\") pod \"crc-debug-dj7ln\" (UID: \"af69c13e-ab90-4eec-bf0d-28ae68a3d16d\") " pod="openshift-must-gather-jwxmq/crc-debug-dj7ln" Oct 07 18:28:58 crc kubenswrapper[4681]: I1007 18:28:58.593501 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jwxmq/crc-debug-dj7ln" Oct 07 18:28:58 crc kubenswrapper[4681]: W1007 18:28:58.642864 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf69c13e_ab90_4eec_bf0d_28ae68a3d16d.slice/crio-6e0dbe35e0b14bfd4a024d7103c326965848e392d9aeccc65bdc23a2a8bd83cb WatchSource:0}: Error finding container 6e0dbe35e0b14bfd4a024d7103c326965848e392d9aeccc65bdc23a2a8bd83cb: Status 404 returned error can't find the container with id 6e0dbe35e0b14bfd4a024d7103c326965848e392d9aeccc65bdc23a2a8bd83cb Oct 07 18:28:58 crc kubenswrapper[4681]: I1007 18:28:58.953407 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jwxmq/crc-debug-dj7ln" event={"ID":"af69c13e-ab90-4eec-bf0d-28ae68a3d16d","Type":"ContainerStarted","Data":"a821c10810f9e44290fb344684c3fe72e7c1c10016bb0d2ab0d3470421d384e7"} Oct 07 18:28:58 crc kubenswrapper[4681]: I1007 18:28:58.953735 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jwxmq/crc-debug-dj7ln" event={"ID":"af69c13e-ab90-4eec-bf0d-28ae68a3d16d","Type":"ContainerStarted","Data":"6e0dbe35e0b14bfd4a024d7103c326965848e392d9aeccc65bdc23a2a8bd83cb"} Oct 07 18:28:58 crc kubenswrapper[4681]: I1007 18:28:58.974461 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jwxmq/crc-debug-dj7ln" podStartSLOduration=0.974443451 podStartE2EDuration="974.443451ms" podCreationTimestamp="2025-10-07 18:28:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 18:28:58.967854519 +0000 UTC m=+5142.615266074" watchObservedRunningTime="2025-10-07 18:28:58.974443451 +0000 UTC m=+5142.621855006" Oct 07 18:28:59 crc kubenswrapper[4681]: I1007 18:28:59.048993 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a06a640-3954-4a6e-8b4a-0d28ba62296a" path="/var/lib/kubelet/pods/8a06a640-3954-4a6e-8b4a-0d28ba62296a/volumes" Oct 07 18:28:59 crc kubenswrapper[4681]: E1007 18:28:59.547943 4681 log.go:32] "ReopenContainerLog from runtime service failed" err="rpc error: code = Unknown desc = container is not running" containerID="a821c10810f9e44290fb344684c3fe72e7c1c10016bb0d2ab0d3470421d384e7" Oct 07 18:28:59 crc kubenswrapper[4681]: E1007 18:28:59.548982 4681 container_log_manager.go:307] "Failed to rotate log for container" err="failed to rotate log \"/var/log/pods/openshift-must-gather-jwxmq_crc-debug-dj7ln_af69c13e-ab90-4eec-bf0d-28ae68a3d16d/container-00/0.log\": failed to reopen container log \"a821c10810f9e44290fb344684c3fe72e7c1c10016bb0d2ab0d3470421d384e7\": rpc error: code = Unknown desc = container is not running" worker=1 
containerID="a821c10810f9e44290fb344684c3fe72e7c1c10016bb0d2ab0d3470421d384e7" path="/var/log/pods/openshift-must-gather-jwxmq_crc-debug-dj7ln_af69c13e-ab90-4eec-bf0d-28ae68a3d16d/container-00/0.log" currentSize=73417720 maxSize=52428800 Oct 07 18:28:59 crc kubenswrapper[4681]: I1007 18:28:59.970841 4681 generic.go:334] "Generic (PLEG): container finished" podID="af69c13e-ab90-4eec-bf0d-28ae68a3d16d" containerID="a821c10810f9e44290fb344684c3fe72e7c1c10016bb0d2ab0d3470421d384e7" exitCode=0 Oct 07 18:28:59 crc kubenswrapper[4681]: I1007 18:28:59.970915 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jwxmq/crc-debug-dj7ln" event={"ID":"af69c13e-ab90-4eec-bf0d-28ae68a3d16d","Type":"ContainerDied","Data":"a821c10810f9e44290fb344684c3fe72e7c1c10016bb0d2ab0d3470421d384e7"} Oct 07 18:29:01 crc kubenswrapper[4681]: I1007 18:29:01.102158 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jwxmq/crc-debug-dj7ln" Oct 07 18:29:01 crc kubenswrapper[4681]: I1007 18:29:01.288838 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j7f7\" (UniqueName: \"kubernetes.io/projected/af69c13e-ab90-4eec-bf0d-28ae68a3d16d-kube-api-access-5j7f7\") pod \"af69c13e-ab90-4eec-bf0d-28ae68a3d16d\" (UID: \"af69c13e-ab90-4eec-bf0d-28ae68a3d16d\") " Oct 07 18:29:01 crc kubenswrapper[4681]: I1007 18:29:01.288915 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af69c13e-ab90-4eec-bf0d-28ae68a3d16d-host\") pod \"af69c13e-ab90-4eec-bf0d-28ae68a3d16d\" (UID: \"af69c13e-ab90-4eec-bf0d-28ae68a3d16d\") " Oct 07 18:29:01 crc kubenswrapper[4681]: I1007 18:29:01.289284 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af69c13e-ab90-4eec-bf0d-28ae68a3d16d-host" (OuterVolumeSpecName: "host") pod "af69c13e-ab90-4eec-bf0d-28ae68a3d16d" (UID: "af69c13e-ab90-4eec-bf0d-28ae68a3d16d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 18:29:01 crc kubenswrapper[4681]: I1007 18:29:01.310998 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af69c13e-ab90-4eec-bf0d-28ae68a3d16d-kube-api-access-5j7f7" (OuterVolumeSpecName: "kube-api-access-5j7f7") pod "af69c13e-ab90-4eec-bf0d-28ae68a3d16d" (UID: "af69c13e-ab90-4eec-bf0d-28ae68a3d16d"). InnerVolumeSpecName "kube-api-access-5j7f7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 18:29:01 crc kubenswrapper[4681]: I1007 18:29:01.390388 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j7f7\" (UniqueName: \"kubernetes.io/projected/af69c13e-ab90-4eec-bf0d-28ae68a3d16d-kube-api-access-5j7f7\") on node \"crc\" DevicePath \"\"" Oct 07 18:29:01 crc kubenswrapper[4681]: I1007 18:29:01.390420 4681 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af69c13e-ab90-4eec-bf0d-28ae68a3d16d-host\") on node \"crc\" DevicePath \"\"" Oct 07 18:29:01 crc kubenswrapper[4681]: I1007 18:29:01.990674 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jwxmq/crc-debug-dj7ln" event={"ID":"af69c13e-ab90-4eec-bf0d-28ae68a3d16d","Type":"ContainerDied","Data":"6e0dbe35e0b14bfd4a024d7103c326965848e392d9aeccc65bdc23a2a8bd83cb"} Oct 07 18:29:01 crc kubenswrapper[4681]: I1007 18:29:01.991121 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e0dbe35e0b14bfd4a024d7103c326965848e392d9aeccc65bdc23a2a8bd83cb" Oct 07 18:29:01 crc kubenswrapper[4681]: I1007 18:29:01.991185 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jwxmq/crc-debug-dj7ln" Oct 07 18:29:06 crc kubenswrapper[4681]: I1007 18:29:06.028953 4681 scope.go:117] "RemoveContainer" containerID="84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822" Oct 07 18:29:06 crc kubenswrapper[4681]: E1007 18:29:06.029735 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:29:07 crc kubenswrapper[4681]: I1007 18:29:07.111702 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jwxmq/crc-debug-dj7ln"] Oct 07 18:29:07 crc kubenswrapper[4681]: I1007 18:29:07.119776 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jwxmq/crc-debug-dj7ln"] Oct 07 18:29:08 crc kubenswrapper[4681]: I1007 18:29:08.296194 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jwxmq/crc-debug-lqfhz"] Oct 07 18:29:08 crc kubenswrapper[4681]: E1007 18:29:08.297446 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af69c13e-ab90-4eec-bf0d-28ae68a3d16d" containerName="container-00" Oct 07 18:29:08 crc kubenswrapper[4681]: I1007 18:29:08.297521 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="af69c13e-ab90-4eec-bf0d-28ae68a3d16d" containerName="container-00" Oct 07 18:29:08 crc kubenswrapper[4681]: I1007 18:29:08.297755 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="af69c13e-ab90-4eec-bf0d-28ae68a3d16d" containerName="container-00" Oct 07 18:29:08 crc kubenswrapper[4681]: I1007 18:29:08.298402 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jwxmq/crc-debug-lqfhz" Oct 07 18:29:08 crc kubenswrapper[4681]: I1007 18:29:08.301486 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-jwxmq"/"default-dockercfg-whdxj" Oct 07 18:29:08 crc kubenswrapper[4681]: I1007 18:29:08.410107 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwjwq\" (UniqueName: \"kubernetes.io/projected/80984412-5b95-4301-845f-70212d25bce5-kube-api-access-pwjwq\") pod \"crc-debug-lqfhz\" (UID: \"80984412-5b95-4301-845f-70212d25bce5\") " pod="openshift-must-gather-jwxmq/crc-debug-lqfhz" Oct 07 18:29:08 crc kubenswrapper[4681]: I1007 18:29:08.410201 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80984412-5b95-4301-845f-70212d25bce5-host\") pod \"crc-debug-lqfhz\" (UID: \"80984412-5b95-4301-845f-70212d25bce5\") " pod="openshift-must-gather-jwxmq/crc-debug-lqfhz" Oct 07 18:29:08 crc kubenswrapper[4681]: I1007 18:29:08.511966 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwjwq\" (UniqueName: \"kubernetes.io/projected/80984412-5b95-4301-845f-70212d25bce5-kube-api-access-pwjwq\") pod \"crc-debug-lqfhz\" (UID: \"80984412-5b95-4301-845f-70212d25bce5\") " pod="openshift-must-gather-jwxmq/crc-debug-lqfhz" Oct 07 18:29:08 crc kubenswrapper[4681]: I1007 18:29:08.512074 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80984412-5b95-4301-845f-70212d25bce5-host\") pod \"crc-debug-lqfhz\" (UID: \"80984412-5b95-4301-845f-70212d25bce5\") " pod="openshift-must-gather-jwxmq/crc-debug-lqfhz" Oct 07 18:29:08 crc kubenswrapper[4681]: I1007 18:29:08.512223 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80984412-5b95-4301-845f-70212d25bce5-host\") pod \"crc-debug-lqfhz\" (UID: \"80984412-5b95-4301-845f-70212d25bce5\") " pod="openshift-must-gather-jwxmq/crc-debug-lqfhz" Oct 07 18:29:08 crc kubenswrapper[4681]: I1007 18:29:08.532340 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwjwq\" (UniqueName: \"kubernetes.io/projected/80984412-5b95-4301-845f-70212d25bce5-kube-api-access-pwjwq\") pod \"crc-debug-lqfhz\" (UID: \"80984412-5b95-4301-845f-70212d25bce5\") " pod="openshift-must-gather-jwxmq/crc-debug-lqfhz" Oct 07 18:29:08 crc kubenswrapper[4681]: I1007 18:29:08.616258 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jwxmq/crc-debug-lqfhz" Oct 07 18:29:09 crc kubenswrapper[4681]: I1007 18:29:09.043943 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af69c13e-ab90-4eec-bf0d-28ae68a3d16d" path="/var/lib/kubelet/pods/af69c13e-ab90-4eec-bf0d-28ae68a3d16d/volumes" Oct 07 18:29:09 crc kubenswrapper[4681]: I1007 18:29:09.067751 4681 generic.go:334] "Generic (PLEG): container finished" podID="80984412-5b95-4301-845f-70212d25bce5" containerID="de2bc48c3c8b2e02e9b6485adfbbeede27bbbca79327dd411fb9ea962157d5ce" exitCode=0 Oct 07 18:29:09 crc kubenswrapper[4681]: I1007 18:29:09.067798 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jwxmq/crc-debug-lqfhz" event={"ID":"80984412-5b95-4301-845f-70212d25bce5","Type":"ContainerDied","Data":"de2bc48c3c8b2e02e9b6485adfbbeede27bbbca79327dd411fb9ea962157d5ce"} Oct 07 18:29:09 crc kubenswrapper[4681]: I1007 18:29:09.067824 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jwxmq/crc-debug-lqfhz" event={"ID":"80984412-5b95-4301-845f-70212d25bce5","Type":"ContainerStarted","Data":"163a5efef9023bca8d67d520e1ee42f3c1afc7e84284bd00c5b8f299eb0abcba"} Oct 07 18:29:09 crc kubenswrapper[4681]: I1007 18:29:09.102976 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jwxmq/crc-debug-lqfhz"] Oct 07 18:29:09 crc kubenswrapper[4681]: I1007 18:29:09.111353 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jwxmq/crc-debug-lqfhz"] Oct 07 18:29:10 crc kubenswrapper[4681]: I1007 18:29:10.202721 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jwxmq/crc-debug-lqfhz" Oct 07 18:29:10 crc kubenswrapper[4681]: I1007 18:29:10.348000 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwjwq\" (UniqueName: \"kubernetes.io/projected/80984412-5b95-4301-845f-70212d25bce5-kube-api-access-pwjwq\") pod \"80984412-5b95-4301-845f-70212d25bce5\" (UID: \"80984412-5b95-4301-845f-70212d25bce5\") " Oct 07 18:29:10 crc kubenswrapper[4681]: I1007 18:29:10.348249 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80984412-5b95-4301-845f-70212d25bce5-host\") pod \"80984412-5b95-4301-845f-70212d25bce5\" (UID: \"80984412-5b95-4301-845f-70212d25bce5\") " Oct 07 18:29:10 crc kubenswrapper[4681]: I1007 18:29:10.348690 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/80984412-5b95-4301-845f-70212d25bce5-host" (OuterVolumeSpecName: "host") pod "80984412-5b95-4301-845f-70212d25bce5" (UID: "80984412-5b95-4301-845f-70212d25bce5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 18:29:10 crc kubenswrapper[4681]: I1007 18:29:10.362527 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80984412-5b95-4301-845f-70212d25bce5-kube-api-access-pwjwq" (OuterVolumeSpecName: "kube-api-access-pwjwq") pod "80984412-5b95-4301-845f-70212d25bce5" (UID: "80984412-5b95-4301-845f-70212d25bce5"). InnerVolumeSpecName "kube-api-access-pwjwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 18:29:10 crc kubenswrapper[4681]: I1007 18:29:10.450650 4681 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80984412-5b95-4301-845f-70212d25bce5-host\") on node \"crc\" DevicePath \"\"" Oct 07 18:29:10 crc kubenswrapper[4681]: I1007 18:29:10.450684 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwjwq\" (UniqueName: \"kubernetes.io/projected/80984412-5b95-4301-845f-70212d25bce5-kube-api-access-pwjwq\") on node \"crc\" DevicePath \"\"" Oct 07 18:29:10 crc kubenswrapper[4681]: I1007 18:29:10.823440 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4_accebcc3-c13d-4dab-bb1b-97f95eb370f3/util/0.log" Oct 07 18:29:11 crc kubenswrapper[4681]: I1007 18:29:11.033431 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4_accebcc3-c13d-4dab-bb1b-97f95eb370f3/util/0.log" Oct 07 18:29:11 crc kubenswrapper[4681]: I1007 18:29:11.040249 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80984412-5b95-4301-845f-70212d25bce5" path="/var/lib/kubelet/pods/80984412-5b95-4301-845f-70212d25bce5/volumes" Oct 07 18:29:11 crc kubenswrapper[4681]: I1007 18:29:11.086129 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4_accebcc3-c13d-4dab-bb1b-97f95eb370f3/pull/0.log" Oct 07 18:29:11 crc kubenswrapper[4681]: I1007 18:29:11.086282 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4_accebcc3-c13d-4dab-bb1b-97f95eb370f3/pull/0.log" Oct 07 18:29:11 crc kubenswrapper[4681]: I1007 18:29:11.102438 4681 scope.go:117] "RemoveContainer" containerID="de2bc48c3c8b2e02e9b6485adfbbeede27bbbca79327dd411fb9ea962157d5ce" Oct 07 18:29:11 crc kubenswrapper[4681]: I1007 18:29:11.102554 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jwxmq/crc-debug-lqfhz" Oct 07 18:29:11 crc kubenswrapper[4681]: I1007 18:29:11.239071 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4_accebcc3-c13d-4dab-bb1b-97f95eb370f3/util/0.log" Oct 07 18:29:11 crc kubenswrapper[4681]: I1007 18:29:11.257436 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4_accebcc3-c13d-4dab-bb1b-97f95eb370f3/pull/0.log" Oct 07 18:29:11 crc kubenswrapper[4681]: I1007 18:29:11.320293 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_886e9a2f401cab405bf40ec8285936a2403b9827163bfd463fda01eef0rq8f4_accebcc3-c13d-4dab-bb1b-97f95eb370f3/extract/0.log" Oct 07 18:29:11 crc kubenswrapper[4681]: I1007 18:29:11.462853 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-dhcz7_fe9f244f-7a1b-43f2-b1d2-08dcf0454fc3/kube-rbac-proxy/0.log" Oct 07 18:29:11 crc kubenswrapper[4681]: I1007 18:29:11.540944 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-dhcz7_fe9f244f-7a1b-43f2-b1d2-08dcf0454fc3/manager/0.log" Oct 07 18:29:11 crc kubenswrapper[4681]: I1007 18:29:11.600659 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-c8nzf_c7125c26-53ab-471e-bf33-05265e3f571a/kube-rbac-proxy/0.log" Oct 07 18:29:11 crc kubenswrapper[4681]: I1007 18:29:11.693908 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-c8nzf_c7125c26-53ab-471e-bf33-05265e3f571a/manager/0.log" Oct 07 18:29:11 crc kubenswrapper[4681]: I1007 18:29:11.776475 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-9qrr7_72f6dfae-3a77-46ad-874b-c94d9059566c/kube-rbac-proxy/0.log" Oct 07 18:29:11 crc kubenswrapper[4681]: I1007 18:29:11.812594 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-9qrr7_72f6dfae-3a77-46ad-874b-c94d9059566c/manager/0.log" Oct 07 18:29:11 crc kubenswrapper[4681]: I1007 18:29:11.914321 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-rhvt8_049764d0-d62e-4553-9628-3d1b7258d126/kube-rbac-proxy/0.log" Oct 07 18:29:12 crc kubenswrapper[4681]: I1007 18:29:12.121005 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-rhvt8_049764d0-d62e-4553-9628-3d1b7258d126/manager/0.log" Oct 07 18:29:12 crc kubenswrapper[4681]: I1007 18:29:12.152128 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-98wq4_c3da478d-c5f4-473c-9848-740845c9adf1/kube-rbac-proxy/0.log" Oct 07 18:29:12 crc kubenswrapper[4681]: I1007 18:29:12.193906 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-98wq4_c3da478d-c5f4-473c-9848-740845c9adf1/manager/0.log" Oct 07 18:29:12 crc kubenswrapper[4681]: I1007 18:29:12.367248 4681 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-8qj4n_8e8c5ada-0313-4a16-b9cd-17d39ce932ca/kube-rbac-proxy/0.log" Oct 07 18:29:12 crc kubenswrapper[4681]: I1007 18:29:12.386239 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-8qj4n_8e8c5ada-0313-4a16-b9cd-17d39ce932ca/manager/0.log" Oct 07 18:29:12 crc kubenswrapper[4681]: I1007 18:29:12.467355 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-9fwcg_f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4/kube-rbac-proxy/0.log" Oct 07 18:29:12 crc kubenswrapper[4681]: I1007 18:29:12.583003 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-mt5xr_049a3d2e-6274-44c0-8b56-d19e8d8b1cfc/kube-rbac-proxy/0.log" Oct 07 18:29:12 crc kubenswrapper[4681]: I1007 18:29:12.756060 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-9fwcg_f5f3e3d1-15b0-46ed-8fbb-bc57c0f4fdf4/manager/0.log" Oct 07 18:29:12 crc kubenswrapper[4681]: I1007 18:29:12.758207 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-mt5xr_049a3d2e-6274-44c0-8b56-d19e8d8b1cfc/manager/0.log" Oct 07 18:29:12 crc kubenswrapper[4681]: I1007 18:29:12.873275 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-8m6q6_bd602c09-19c5-45a7-b8fa-4202e147bbf9/kube-rbac-proxy/0.log" Oct 07 18:29:13 crc kubenswrapper[4681]: I1007 18:29:13.057122 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-8m6q6_bd602c09-19c5-45a7-b8fa-4202e147bbf9/manager/0.log" Oct 07 18:29:13 crc kubenswrapper[4681]: I1007 18:29:13.069036 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-spqlq_5cc0eff1-427a-4489-8957-f5148e6a0630/kube-rbac-proxy/0.log" Oct 07 18:29:13 crc kubenswrapper[4681]: I1007 18:29:13.135735 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-spqlq_5cc0eff1-427a-4489-8957-f5148e6a0630/manager/0.log" Oct 07 18:29:13 crc kubenswrapper[4681]: I1007 18:29:13.254573 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-ddgm8_720e687c-21aa-4f31-bc4f-7be0f836ec16/kube-rbac-proxy/0.log" Oct 07 18:29:13 crc kubenswrapper[4681]: I1007 18:29:13.387794 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-ddgm8_720e687c-21aa-4f31-bc4f-7be0f836ec16/manager/0.log" Oct 07 18:29:13 crc kubenswrapper[4681]: I1007 18:29:13.566645 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-plkk5_52146033-65f3-42f4-b0b8-2b550445305f/kube-rbac-proxy/0.log" Oct 07 18:29:13 crc kubenswrapper[4681]: I1007 18:29:13.622317 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-plkk5_52146033-65f3-42f4-b0b8-2b550445305f/manager/0.log" Oct 07 18:29:13 crc kubenswrapper[4681]: I1007 18:29:13.703780 4681 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-vh9d7_9c9bc247-6ea6-486c-956c-292930b2c111/kube-rbac-proxy/0.log" Oct 07 18:29:13 crc kubenswrapper[4681]: I1007 18:29:13.853114 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-vh9d7_9c9bc247-6ea6-486c-956c-292930b2c111/manager/0.log" Oct 07 18:29:13 crc kubenswrapper[4681]: I1007 18:29:13.907322 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-t75k9_c318e2b6-9014-471c-b54d-de14e50a1dfe/kube-rbac-proxy/0.log" Oct 07 18:29:14 crc kubenswrapper[4681]: I1007 18:29:14.023815 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-t75k9_c318e2b6-9014-471c-b54d-de14e50a1dfe/manager/0.log" Oct 07 18:29:14 crc kubenswrapper[4681]: I1007 18:29:14.165220 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665clkrns_a7636f86-a942-4f89-bc80-01a3ce70c13e/kube-rbac-proxy/0.log" Oct 07 18:29:14 crc kubenswrapper[4681]: I1007 18:29:14.203777 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665clkrns_a7636f86-a942-4f89-bc80-01a3ce70c13e/manager/0.log" Oct 07 18:29:14 crc kubenswrapper[4681]: I1007 18:29:14.392705 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-77dffbdc98-vqctw_66b31094-5895-41aa-a268-fd2d13990f9f/kube-rbac-proxy/0.log" Oct 07 18:29:14 crc kubenswrapper[4681]: I1007 18:29:14.565934 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6687d89476-pv9kh_9e7a0d41-92ad-4dd7-b836-04c049817f6f/kube-rbac-proxy/0.log" Oct 07 18:29:14 crc kubenswrapper[4681]: I1007 18:29:14.877457 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6687d89476-pv9kh_9e7a0d41-92ad-4dd7-b836-04c049817f6f/operator/0.log" Oct 07 18:29:14 crc kubenswrapper[4681]: I1007 18:29:14.889590 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-w4jqq_0613f93f-af7c-4a36-8baa-642a076f5666/registry-server/0.log" Oct 07 18:29:15 crc kubenswrapper[4681]: I1007 18:29:15.089937 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-v7gh2_10d09ccc-8bc7-4bf8-8bb4-b5bd1b234b28/kube-rbac-proxy/0.log" Oct 07 18:29:15 crc kubenswrapper[4681]: I1007 18:29:15.233116 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-v7gh2_10d09ccc-8bc7-4bf8-8bb4-b5bd1b234b28/manager/0.log" Oct 07 18:29:15 crc kubenswrapper[4681]: I1007 18:29:15.377664 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-688l4_97a43b61-b120-4613-9b2a-603e1d90878a/kube-rbac-proxy/0.log" Oct 07 18:29:15 crc kubenswrapper[4681]: I1007 18:29:15.439794 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-688l4_97a43b61-b120-4613-9b2a-603e1d90878a/manager/0.log" Oct 07 18:29:15 crc kubenswrapper[4681]: 
I1007 18:29:15.536624 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-77dffbdc98-vqctw_66b31094-5895-41aa-a268-fd2d13990f9f/manager/0.log" Oct 07 18:29:15 crc kubenswrapper[4681]: I1007 18:29:15.625122 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-sq4dv_357d30fc-7c29-4bea-a20a-926b5723bcb0/operator/0.log" Oct 07 18:29:15 crc kubenswrapper[4681]: I1007 18:29:15.728729 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-bxvlr_df2c4c3b-cd2b-487e-bef4-071d2c9f0eb4/kube-rbac-proxy/0.log" Oct 07 18:29:15 crc kubenswrapper[4681]: I1007 18:29:15.777633 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-bxvlr_df2c4c3b-cd2b-487e-bef4-071d2c9f0eb4/manager/0.log" Oct 07 18:29:15 crc kubenswrapper[4681]: I1007 18:29:15.836549 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-zhppp_1a3899b1-53e5-413b-b1c1-c7d2f2274b75/kube-rbac-proxy/0.log" Oct 07 18:29:15 crc kubenswrapper[4681]: I1007 18:29:15.929939 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-zhppp_1a3899b1-53e5-413b-b1c1-c7d2f2274b75/manager/0.log" Oct 07 18:29:16 crc kubenswrapper[4681]: I1007 18:29:16.028876 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-svd7g_be78a905-7f1e-4ea1-baf4-5f84246df65f/kube-rbac-proxy/0.log" Oct 07 18:29:16 crc kubenswrapper[4681]: I1007 18:29:16.060378 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-svd7g_be78a905-7f1e-4ea1-baf4-5f84246df65f/manager/0.log" Oct 07 18:29:16 crc kubenswrapper[4681]: I1007 18:29:16.166475 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-gn84j_6e4f29f4-5ec2-4476-9153-954cc984443f/kube-rbac-proxy/0.log" Oct 07 18:29:16 crc kubenswrapper[4681]: I1007 18:29:16.189690 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-gn84j_6e4f29f4-5ec2-4476-9153-954cc984443f/manager/0.log" Oct 07 18:29:21 crc kubenswrapper[4681]: I1007 18:29:21.029739 4681 scope.go:117] "RemoveContainer" containerID="84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822" Oct 07 18:29:21 crc kubenswrapper[4681]: E1007 18:29:21.030498 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:29:31 crc kubenswrapper[4681]: I1007 18:29:31.408166 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-6zl9n_dd5794df-cde0-4881-921f-9ba7006d4281/control-plane-machine-set-operator/0.log" Oct 07 18:29:31 crc kubenswrapper[4681]: I1007 18:29:31.603668 4681 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tg8wr_ba59400b-2ce1-489d-a70d-747f23b176c6/machine-api-operator/0.log" Oct 07 18:29:31 crc kubenswrapper[4681]: I1007 18:29:31.620196 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tg8wr_ba59400b-2ce1-489d-a70d-747f23b176c6/kube-rbac-proxy/0.log" Oct 07 18:29:32 crc kubenswrapper[4681]: I1007 18:29:32.029775 4681 scope.go:117] "RemoveContainer" containerID="84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822" Oct 07 18:29:32 crc kubenswrapper[4681]: E1007 18:29:32.030465 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:29:43 crc kubenswrapper[4681]: I1007 18:29:43.124869 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-5qxkp_e132d85f-c498-41eb-a780-be92455331bb/cert-manager-controller/0.log" Oct 07 18:29:43 crc kubenswrapper[4681]: I1007 18:29:43.259746 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-wvq66_b603cb8d-41a5-4537-95da-d2e4fa39ce75/cert-manager-cainjector/0.log" Oct 07 18:29:43 crc kubenswrapper[4681]: I1007 18:29:43.396868 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-djw5h_a8a15de5-2d99-41a6-b4c9-7d31c28413b2/cert-manager-webhook/0.log" Oct 07 18:29:46 crc kubenswrapper[4681]: I1007 18:29:46.029399 4681 scope.go:117] "RemoveContainer" containerID="84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822" Oct 07 18:29:46 crc kubenswrapper[4681]: E1007 18:29:46.029995 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:29:55 crc kubenswrapper[4681]: I1007 18:29:55.620579 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-klc4h_d8e7f096-d849-4c9d-8338-2117a554f2de/nmstate-console-plugin/0.log" Oct 07 18:29:55 crc kubenswrapper[4681]: I1007 18:29:55.957891 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-lc2dx_e37df9f5-e512-4e43-9c96-c193553b43dd/nmstate-handler/0.log" Oct 07 18:29:56 crc kubenswrapper[4681]: I1007 18:29:56.015949 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-vkkc9_abc9d0ca-7b47-4f55-93ff-2f6cfa725fe7/kube-rbac-proxy/0.log" Oct 07 18:29:56 crc kubenswrapper[4681]: I1007 18:29:56.175474 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-vkkc9_abc9d0ca-7b47-4f55-93ff-2f6cfa725fe7/nmstate-metrics/0.log" Oct 07 18:29:56 crc kubenswrapper[4681]: I1007 18:29:56.233076 4681 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-2xfw7_d77c5294-44ad-4618-abf8-143fb7872315/nmstate-operator/0.log" Oct 07 18:29:56 crc kubenswrapper[4681]: I1007 18:29:56.411719 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-7zfgs_64eeb4ec-129d-4fc8-be68-138e9c28cd3c/nmstate-webhook/0.log" Oct 07 18:29:57 crc kubenswrapper[4681]: I1007 18:29:57.035130 4681 scope.go:117] "RemoveContainer" containerID="84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822" Oct 07 18:29:57 crc kubenswrapper[4681]: E1007 18:29:57.035520 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:30:00 crc kubenswrapper[4681]: I1007 18:30:00.165438 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331030-9p7mh"] Oct 07 18:30:00 crc kubenswrapper[4681]: E1007 18:30:00.166425 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80984412-5b95-4301-845f-70212d25bce5" containerName="container-00" Oct 07 18:30:00 crc kubenswrapper[4681]: I1007 18:30:00.166441 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="80984412-5b95-4301-845f-70212d25bce5" containerName="container-00" Oct 07 18:30:00 crc kubenswrapper[4681]: I1007 18:30:00.166726 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="80984412-5b95-4301-845f-70212d25bce5" containerName="container-00" Oct 07 18:30:00 crc kubenswrapper[4681]: I1007 18:30:00.167541 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331030-9p7mh" Oct 07 18:30:00 crc kubenswrapper[4681]: I1007 18:30:00.169810 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 18:30:00 crc kubenswrapper[4681]: I1007 18:30:00.170324 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 18:30:00 crc kubenswrapper[4681]: I1007 18:30:00.194255 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331030-9p7mh"] Oct 07 18:30:00 crc kubenswrapper[4681]: I1007 18:30:00.304156 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/754ba69b-5a52-43f2-8e3e-a4f505bd62d7-secret-volume\") pod \"collect-profiles-29331030-9p7mh\" (UID: \"754ba69b-5a52-43f2-8e3e-a4f505bd62d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331030-9p7mh" Oct 07 18:30:00 crc kubenswrapper[4681]: I1007 18:30:00.304204 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/754ba69b-5a52-43f2-8e3e-a4f505bd62d7-config-volume\") pod \"collect-profiles-29331030-9p7mh\" (UID: \"754ba69b-5a52-43f2-8e3e-a4f505bd62d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331030-9p7mh" Oct 07 18:30:00 crc kubenswrapper[4681]: I1007 18:30:00.304252 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44wkd\" (UniqueName: \"kubernetes.io/projected/754ba69b-5a52-43f2-8e3e-a4f505bd62d7-kube-api-access-44wkd\") pod \"collect-profiles-29331030-9p7mh\" (UID: \"754ba69b-5a52-43f2-8e3e-a4f505bd62d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331030-9p7mh" Oct 07 18:30:00 crc kubenswrapper[4681]: I1007 18:30:00.406193 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/754ba69b-5a52-43f2-8e3e-a4f505bd62d7-secret-volume\") pod \"collect-profiles-29331030-9p7mh\" (UID: \"754ba69b-5a52-43f2-8e3e-a4f505bd62d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331030-9p7mh" Oct 07 18:30:00 crc kubenswrapper[4681]: I1007 18:30:00.406235 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/754ba69b-5a52-43f2-8e3e-a4f505bd62d7-config-volume\") pod \"collect-profiles-29331030-9p7mh\" (UID: \"754ba69b-5a52-43f2-8e3e-a4f505bd62d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331030-9p7mh" Oct 07 18:30:00 crc kubenswrapper[4681]: I1007 18:30:00.406279 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44wkd\" (UniqueName: \"kubernetes.io/projected/754ba69b-5a52-43f2-8e3e-a4f505bd62d7-kube-api-access-44wkd\") pod \"collect-profiles-29331030-9p7mh\" (UID: \"754ba69b-5a52-43f2-8e3e-a4f505bd62d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331030-9p7mh" Oct 07 18:30:00 crc kubenswrapper[4681]: I1007 18:30:00.407146 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/754ba69b-5a52-43f2-8e3e-a4f505bd62d7-config-volume\") pod 
\"collect-profiles-29331030-9p7mh\" (UID: \"754ba69b-5a52-43f2-8e3e-a4f505bd62d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331030-9p7mh" Oct 07 18:30:00 crc kubenswrapper[4681]: I1007 18:30:00.417747 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/754ba69b-5a52-43f2-8e3e-a4f505bd62d7-secret-volume\") pod \"collect-profiles-29331030-9p7mh\" (UID: \"754ba69b-5a52-43f2-8e3e-a4f505bd62d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331030-9p7mh" Oct 07 18:30:00 crc kubenswrapper[4681]: I1007 18:30:00.424657 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44wkd\" (UniqueName: \"kubernetes.io/projected/754ba69b-5a52-43f2-8e3e-a4f505bd62d7-kube-api-access-44wkd\") pod \"collect-profiles-29331030-9p7mh\" (UID: \"754ba69b-5a52-43f2-8e3e-a4f505bd62d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29331030-9p7mh" Oct 07 18:30:00 crc kubenswrapper[4681]: I1007 18:30:00.491859 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331030-9p7mh" Oct 07 18:30:01 crc kubenswrapper[4681]: I1007 18:30:01.043369 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29331030-9p7mh"] Oct 07 18:30:01 crc kubenswrapper[4681]: I1007 18:30:01.540641 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331030-9p7mh" event={"ID":"754ba69b-5a52-43f2-8e3e-a4f505bd62d7","Type":"ContainerStarted","Data":"608ceb705ffb64363b7079c2ff4658c6161c53125306bcbc61b7fa98e0428ded"} Oct 07 18:30:01 crc kubenswrapper[4681]: I1007 18:30:01.540700 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331030-9p7mh" event={"ID":"754ba69b-5a52-43f2-8e3e-a4f505bd62d7","Type":"ContainerStarted","Data":"07d7aa2b7fbd284f46a3ac08e11224d1362542e1933ec6bc5ba398f2fa93aaa6"} Oct 07 18:30:02 crc kubenswrapper[4681]: I1007 18:30:02.552406 4681 generic.go:334] "Generic (PLEG): container finished" podID="754ba69b-5a52-43f2-8e3e-a4f505bd62d7" containerID="608ceb705ffb64363b7079c2ff4658c6161c53125306bcbc61b7fa98e0428ded" exitCode=0 Oct 07 18:30:02 crc kubenswrapper[4681]: I1007 18:30:02.552458 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331030-9p7mh" event={"ID":"754ba69b-5a52-43f2-8e3e-a4f505bd62d7","Type":"ContainerDied","Data":"608ceb705ffb64363b7079c2ff4658c6161c53125306bcbc61b7fa98e0428ded"} Oct 07 18:30:04 crc kubenswrapper[4681]: I1007 18:30:04.035992 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331030-9p7mh" Oct 07 18:30:04 crc kubenswrapper[4681]: I1007 18:30:04.175368 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/754ba69b-5a52-43f2-8e3e-a4f505bd62d7-secret-volume\") pod \"754ba69b-5a52-43f2-8e3e-a4f505bd62d7\" (UID: \"754ba69b-5a52-43f2-8e3e-a4f505bd62d7\") " Oct 07 18:30:04 crc kubenswrapper[4681]: I1007 18:30:04.175820 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/754ba69b-5a52-43f2-8e3e-a4f505bd62d7-config-volume\") pod \"754ba69b-5a52-43f2-8e3e-a4f505bd62d7\" (UID: \"754ba69b-5a52-43f2-8e3e-a4f505bd62d7\") " Oct 07 18:30:04 crc kubenswrapper[4681]: I1007 18:30:04.175859 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44wkd\" (UniqueName: \"kubernetes.io/projected/754ba69b-5a52-43f2-8e3e-a4f505bd62d7-kube-api-access-44wkd\") pod \"754ba69b-5a52-43f2-8e3e-a4f505bd62d7\" (UID: \"754ba69b-5a52-43f2-8e3e-a4f505bd62d7\") " Oct 07 18:30:04 crc kubenswrapper[4681]: I1007 18:30:04.176771 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/754ba69b-5a52-43f2-8e3e-a4f505bd62d7-config-volume" (OuterVolumeSpecName: "config-volume") pod "754ba69b-5a52-43f2-8e3e-a4f505bd62d7" (UID: "754ba69b-5a52-43f2-8e3e-a4f505bd62d7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 18:30:04 crc kubenswrapper[4681]: I1007 18:30:04.177162 4681 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/754ba69b-5a52-43f2-8e3e-a4f505bd62d7-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 18:30:04 crc kubenswrapper[4681]: I1007 18:30:04.196088 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/754ba69b-5a52-43f2-8e3e-a4f505bd62d7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "754ba69b-5a52-43f2-8e3e-a4f505bd62d7" (UID: "754ba69b-5a52-43f2-8e3e-a4f505bd62d7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 18:30:04 crc kubenswrapper[4681]: I1007 18:30:04.196286 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/754ba69b-5a52-43f2-8e3e-a4f505bd62d7-kube-api-access-44wkd" (OuterVolumeSpecName: "kube-api-access-44wkd") pod "754ba69b-5a52-43f2-8e3e-a4f505bd62d7" (UID: "754ba69b-5a52-43f2-8e3e-a4f505bd62d7"). InnerVolumeSpecName "kube-api-access-44wkd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 18:30:04 crc kubenswrapper[4681]: I1007 18:30:04.279051 4681 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/754ba69b-5a52-43f2-8e3e-a4f505bd62d7-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 18:30:04 crc kubenswrapper[4681]: I1007 18:30:04.279090 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44wkd\" (UniqueName: \"kubernetes.io/projected/754ba69b-5a52-43f2-8e3e-a4f505bd62d7-kube-api-access-44wkd\") on node \"crc\" DevicePath \"\"" Oct 07 18:30:04 crc kubenswrapper[4681]: I1007 18:30:04.569611 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29331030-9p7mh" event={"ID":"754ba69b-5a52-43f2-8e3e-a4f505bd62d7","Type":"ContainerDied","Data":"07d7aa2b7fbd284f46a3ac08e11224d1362542e1933ec6bc5ba398f2fa93aaa6"} Oct 07 18:30:04 crc kubenswrapper[4681]: I1007 18:30:04.569646 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07d7aa2b7fbd284f46a3ac08e11224d1362542e1933ec6bc5ba398f2fa93aaa6" Oct 07 18:30:04 crc kubenswrapper[4681]: I1007 18:30:04.569698 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29331030-9p7mh" Oct 07 18:30:04 crc kubenswrapper[4681]: I1007 18:30:04.644557 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330985-mj5jz"] Oct 07 18:30:04 crc kubenswrapper[4681]: I1007 18:30:04.652373 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330985-mj5jz"] Oct 07 18:30:05 crc kubenswrapper[4681]: I1007 18:30:05.042220 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1f5432a-f229-4146-93f1-d053b31f680a" path="/var/lib/kubelet/pods/e1f5432a-f229-4146-93f1-d053b31f680a/volumes" Oct 07 18:30:12 crc kubenswrapper[4681]: I1007 18:30:12.029538 4681 scope.go:117] "RemoveContainer" containerID="84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822" Oct 07 18:30:12 crc kubenswrapper[4681]: E1007 18:30:12.030278 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:30:12 crc kubenswrapper[4681]: I1007 18:30:12.202760 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-xszcp_ec498aeb-7c28-4e30-adee-e4546d01d498/kube-rbac-proxy/0.log" Oct 07 18:30:12 crc kubenswrapper[4681]: I1007 18:30:12.295713 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-xszcp_ec498aeb-7c28-4e30-adee-e4546d01d498/controller/0.log" Oct 07 18:30:12 crc kubenswrapper[4681]: I1007 18:30:12.448526 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/cp-frr-files/0.log" Oct 07 18:30:12 crc kubenswrapper[4681]: I1007 18:30:12.644780 4681 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/cp-frr-files/0.log" Oct 07 18:30:12 crc kubenswrapper[4681]: I1007 18:30:12.715273 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/cp-reloader/0.log" Oct 07 18:30:12 crc kubenswrapper[4681]: I1007 18:30:12.717481 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/cp-metrics/0.log" Oct 07 18:30:12 crc kubenswrapper[4681]: I1007 18:30:12.743208 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/cp-reloader/0.log" Oct 07 18:30:12 crc kubenswrapper[4681]: I1007 18:30:12.889272 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/cp-frr-files/0.log" Oct 07 18:30:12 crc kubenswrapper[4681]: I1007 18:30:12.928809 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/cp-reloader/0.log" Oct 07 18:30:12 crc kubenswrapper[4681]: I1007 18:30:12.943638 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/cp-metrics/0.log" Oct 07 18:30:12 crc kubenswrapper[4681]: I1007 18:30:12.965915 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/cp-metrics/0.log" Oct 07 18:30:13 crc kubenswrapper[4681]: I1007 18:30:13.183236 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/cp-metrics/0.log" Oct 07 18:30:13 crc kubenswrapper[4681]: I1007 18:30:13.187614 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/cp-frr-files/0.log" Oct 07 18:30:13 crc kubenswrapper[4681]: I1007 18:30:13.193076 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/cp-reloader/0.log" Oct 07 18:30:13 crc kubenswrapper[4681]: I1007 18:30:13.222954 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/controller/0.log" Oct 07 18:30:13 crc kubenswrapper[4681]: I1007 18:30:13.378334 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/kube-rbac-proxy/0.log" Oct 07 18:30:13 crc kubenswrapper[4681]: I1007 18:30:13.423469 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/frr-metrics/0.log" Oct 07 18:30:13 crc kubenswrapper[4681]: I1007 18:30:13.458490 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/kube-rbac-proxy-frr/0.log" Oct 07 18:30:13 crc kubenswrapper[4681]: I1007 18:30:13.694181 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/reloader/0.log" Oct 07 18:30:13 crc kubenswrapper[4681]: I1007 18:30:13.743378 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-zcsl2_0227af93-e3dc-47c9-b6ce-57d25fc998ea/frr-k8s-webhook-server/0.log" Oct 07 
18:30:14 crc kubenswrapper[4681]: I1007 18:30:14.041780 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-598476574-wb9sj_abb48906-478a-4687-9e03-76d9035242b8/manager/0.log" Oct 07 18:30:14 crc kubenswrapper[4681]: I1007 18:30:14.176994 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5996f7f8c8-n6prk_350e1b80-5296-4a5b-a604-e9a42b56cbd1/webhook-server/0.log" Oct 07 18:30:14 crc kubenswrapper[4681]: I1007 18:30:14.353585 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-nzg7f_3f34c830-b1bc-433a-af20-0db4f0d96394/kube-rbac-proxy/0.log" Oct 07 18:30:15 crc kubenswrapper[4681]: I1007 18:30:15.028288 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-nzg7f_3f34c830-b1bc-433a-af20-0db4f0d96394/speaker/0.log" Oct 07 18:30:15 crc kubenswrapper[4681]: I1007 18:30:15.057502 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7f48v_45c71c49-f633-48af-a495-a1bdf06d66b9/frr/0.log" Oct 07 18:30:23 crc kubenswrapper[4681]: I1007 18:30:23.029768 4681 scope.go:117] "RemoveContainer" containerID="84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822" Oct 07 18:30:23 crc kubenswrapper[4681]: E1007 18:30:23.030552 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:30:26 crc kubenswrapper[4681]: I1007 18:30:26.021980 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj_9112c3b1-a90a-48d2-9282-cd9f4c055d39/util/0.log" Oct 07 18:30:26 crc kubenswrapper[4681]: I1007 18:30:26.242186 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj_9112c3b1-a90a-48d2-9282-cd9f4c055d39/util/0.log" Oct 07 18:30:26 crc kubenswrapper[4681]: I1007 18:30:26.252387 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj_9112c3b1-a90a-48d2-9282-cd9f4c055d39/pull/0.log" Oct 07 18:30:26 crc kubenswrapper[4681]: I1007 18:30:26.267085 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj_9112c3b1-a90a-48d2-9282-cd9f4c055d39/pull/0.log" Oct 07 18:30:26 crc kubenswrapper[4681]: I1007 18:30:26.508726 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj_9112c3b1-a90a-48d2-9282-cd9f4c055d39/util/0.log" Oct 07 18:30:26 crc kubenswrapper[4681]: I1007 18:30:26.509751 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj_9112c3b1-a90a-48d2-9282-cd9f4c055d39/extract/0.log" Oct 07 18:30:26 crc kubenswrapper[4681]: I1007 18:30:26.512697 4681 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r66sj_9112c3b1-a90a-48d2-9282-cd9f4c055d39/pull/0.log" Oct 07 18:30:26 crc kubenswrapper[4681]: I1007 18:30:26.686411 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dxw5l_7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d/extract-utilities/0.log" Oct 07 18:30:26 crc kubenswrapper[4681]: I1007 18:30:26.844709 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dxw5l_7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d/extract-content/0.log" Oct 07 18:30:26 crc kubenswrapper[4681]: I1007 18:30:26.900226 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dxw5l_7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d/extract-utilities/0.log" Oct 07 18:30:26 crc kubenswrapper[4681]: I1007 18:30:26.906624 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dxw5l_7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d/extract-content/0.log" Oct 07 18:30:27 crc kubenswrapper[4681]: I1007 18:30:27.166136 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dxw5l_7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d/extract-content/0.log" Oct 07 18:30:27 crc kubenswrapper[4681]: I1007 18:30:27.232573 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dxw5l_7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d/extract-utilities/0.log" Oct 07 18:30:27 crc kubenswrapper[4681]: I1007 18:30:27.333517 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dxw5l_7bd54664-a4e4-4dce-9bb9-bb9f0b9ee92d/registry-server/0.log" Oct 07 18:30:27 crc kubenswrapper[4681]: I1007 18:30:27.422610 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ntpwx_b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee/extract-utilities/0.log" Oct 07 18:30:27 crc kubenswrapper[4681]: I1007 18:30:27.657261 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ntpwx_b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee/extract-content/0.log" Oct 07 18:30:27 crc kubenswrapper[4681]: I1007 18:30:27.681055 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ntpwx_b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee/extract-utilities/0.log" Oct 07 18:30:27 crc kubenswrapper[4681]: I1007 18:30:27.710698 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ntpwx_b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee/extract-content/0.log" Oct 07 18:30:27 crc kubenswrapper[4681]: I1007 18:30:27.875027 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ntpwx_b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee/extract-content/0.log" Oct 07 18:30:27 crc kubenswrapper[4681]: I1007 18:30:27.902969 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ntpwx_b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee/extract-utilities/0.log" Oct 07 18:30:28 crc kubenswrapper[4681]: I1007 18:30:28.196580 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn_d7efbe4b-6c4e-4597-a08a-c65043f2466a/util/0.log" Oct 07 18:30:28 crc kubenswrapper[4681]: I1007 18:30:28.465127 4681 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn_d7efbe4b-6c4e-4597-a08a-c65043f2466a/pull/0.log" Oct 07 18:30:28 crc kubenswrapper[4681]: I1007 18:30:28.472437 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn_d7efbe4b-6c4e-4597-a08a-c65043f2466a/pull/0.log" Oct 07 18:30:28 crc kubenswrapper[4681]: I1007 18:30:28.577697 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn_d7efbe4b-6c4e-4597-a08a-c65043f2466a/util/0.log" Oct 07 18:30:28 crc kubenswrapper[4681]: I1007 18:30:28.754069 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ntpwx_b81cacc9-a147-4bfe-887d-b7bfbd0ca3ee/registry-server/0.log" Oct 07 18:30:28 crc kubenswrapper[4681]: I1007 18:30:28.782527 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn_d7efbe4b-6c4e-4597-a08a-c65043f2466a/extract/0.log" Oct 07 18:30:28 crc kubenswrapper[4681]: I1007 18:30:28.802774 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn_d7efbe4b-6c4e-4597-a08a-c65043f2466a/pull/0.log" Oct 07 18:30:28 crc kubenswrapper[4681]: I1007 18:30:28.886460 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5c4dn_d7efbe4b-6c4e-4597-a08a-c65043f2466a/util/0.log" Oct 07 18:30:29 crc kubenswrapper[4681]: I1007 18:30:29.083542 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-kbm6c_4bf28f5b-4ee1-444b-ad73-7d63ecbd05c9/marketplace-operator/0.log" Oct 07 18:30:29 crc kubenswrapper[4681]: I1007 18:30:29.110511 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jvq9k_fb7e45fe-c863-485b-a67b-133a94f0a533/extract-utilities/0.log" Oct 07 18:30:29 crc kubenswrapper[4681]: I1007 18:30:29.333556 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jvq9k_fb7e45fe-c863-485b-a67b-133a94f0a533/extract-utilities/0.log" Oct 07 18:30:29 crc kubenswrapper[4681]: I1007 18:30:29.368579 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jvq9k_fb7e45fe-c863-485b-a67b-133a94f0a533/extract-content/0.log" Oct 07 18:30:29 crc kubenswrapper[4681]: I1007 18:30:29.368843 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jvq9k_fb7e45fe-c863-485b-a67b-133a94f0a533/extract-content/0.log" Oct 07 18:30:29 crc kubenswrapper[4681]: I1007 18:30:29.560093 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jvq9k_fb7e45fe-c863-485b-a67b-133a94f0a533/extract-utilities/0.log" Oct 07 18:30:29 crc kubenswrapper[4681]: I1007 18:30:29.587396 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jvq9k_fb7e45fe-c863-485b-a67b-133a94f0a533/extract-content/0.log" Oct 07 18:30:29 crc kubenswrapper[4681]: I1007 18:30:29.736168 4681 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-jvq9k_fb7e45fe-c863-485b-a67b-133a94f0a533/registry-server/0.log" Oct 07 18:30:29 crc kubenswrapper[4681]: I1007 18:30:29.857228 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2vsqz_9cc90449-f49f-4406-8af2-882d7e19b3f4/extract-utilities/0.log" Oct 07 18:30:30 crc kubenswrapper[4681]: I1007 18:30:30.023585 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2vsqz_9cc90449-f49f-4406-8af2-882d7e19b3f4/extract-content/0.log" Oct 07 18:30:30 crc kubenswrapper[4681]: I1007 18:30:30.050367 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2vsqz_9cc90449-f49f-4406-8af2-882d7e19b3f4/extract-utilities/0.log" Oct 07 18:30:30 crc kubenswrapper[4681]: I1007 18:30:30.059547 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2vsqz_9cc90449-f49f-4406-8af2-882d7e19b3f4/extract-content/0.log" Oct 07 18:30:30 crc kubenswrapper[4681]: I1007 18:30:30.202730 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2vsqz_9cc90449-f49f-4406-8af2-882d7e19b3f4/extract-utilities/0.log" Oct 07 18:30:30 crc kubenswrapper[4681]: I1007 18:30:30.246508 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2vsqz_9cc90449-f49f-4406-8af2-882d7e19b3f4/extract-content/0.log" Oct 07 18:30:30 crc kubenswrapper[4681]: I1007 18:30:30.691651 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2vsqz_9cc90449-f49f-4406-8af2-882d7e19b3f4/registry-server/0.log" Oct 07 18:30:37 crc kubenswrapper[4681]: I1007 18:30:37.719403 4681 scope.go:117] "RemoveContainer" containerID="06573a68eacfcfce60cff64ce45d16aac7bd144443fceb9d68cd78b91fda4c4b" Oct 07 18:30:38 crc kubenswrapper[4681]: I1007 18:30:38.029479 4681 scope.go:117] "RemoveContainer" containerID="84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822" Oct 07 18:30:38 crc kubenswrapper[4681]: E1007 18:30:38.030245 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:30:51 crc kubenswrapper[4681]: I1007 18:30:51.029700 4681 scope.go:117] "RemoveContainer" containerID="84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822" Oct 07 18:30:51 crc kubenswrapper[4681]: E1007 18:30:51.030495 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" Oct 07 18:31:01 crc kubenswrapper[4681]: E1007 18:31:01.686865 4681 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.129.56.93:48728->38.129.56.93:44823: read tcp 38.129.56.93:48728->38.129.56.93:44823: read: connection reset by peer Oct 07 
Oct 07 18:31:03 crc kubenswrapper[4681]: I1007 18:31:03.029843 4681 scope.go:117] "RemoveContainer" containerID="84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822"
Oct 07 18:31:03 crc kubenswrapper[4681]: E1007 18:31:03.030502 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8z5w6_openshift-machine-config-operator(0888bed1-620e-4a75-bcf8-460b4cd280ea)\"" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea"
Oct 07 18:31:16 crc kubenswrapper[4681]: I1007 18:31:16.029302 4681 scope.go:117] "RemoveContainer" containerID="84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822"
Oct 07 18:31:17 crc kubenswrapper[4681]: I1007 18:31:17.205910 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerStarted","Data":"1cdf0cb28dd02a2ebb3c0a6f0bb22f5afd48841d5bba591bfff927634e0bc73a"}
Oct 07 18:31:26 crc kubenswrapper[4681]: I1007 18:31:26.183247 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l9grq"]
Oct 07 18:31:26 crc kubenswrapper[4681]: E1007 18:31:26.184299 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="754ba69b-5a52-43f2-8e3e-a4f505bd62d7" containerName="collect-profiles"
Oct 07 18:31:26 crc kubenswrapper[4681]: I1007 18:31:26.184321 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="754ba69b-5a52-43f2-8e3e-a4f505bd62d7" containerName="collect-profiles"
Oct 07 18:31:26 crc kubenswrapper[4681]: I1007 18:31:26.184555 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="754ba69b-5a52-43f2-8e3e-a4f505bd62d7" containerName="collect-profiles"
Oct 07 18:31:26 crc kubenswrapper[4681]: I1007 18:31:26.185952 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l9grq"
Oct 07 18:31:26 crc kubenswrapper[4681]: I1007 18:31:26.209154 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l9grq"]
Oct 07 18:31:26 crc kubenswrapper[4681]: I1007 18:31:26.266934 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad104b60-6fde-4200-a943-0d91730d264c-utilities\") pod \"community-operators-l9grq\" (UID: \"ad104b60-6fde-4200-a943-0d91730d264c\") " pod="openshift-marketplace/community-operators-l9grq"
Oct 07 18:31:26 crc kubenswrapper[4681]: I1007 18:31:26.267003 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rznxr\" (UniqueName: \"kubernetes.io/projected/ad104b60-6fde-4200-a943-0d91730d264c-kube-api-access-rznxr\") pod \"community-operators-l9grq\" (UID: \"ad104b60-6fde-4200-a943-0d91730d264c\") " pod="openshift-marketplace/community-operators-l9grq"
Oct 07 18:31:26 crc kubenswrapper[4681]: I1007 18:31:26.267127 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad104b60-6fde-4200-a943-0d91730d264c-catalog-content\") pod \"community-operators-l9grq\" (UID: \"ad104b60-6fde-4200-a943-0d91730d264c\") " pod="openshift-marketplace/community-operators-l9grq"
Oct 07 18:31:26 crc kubenswrapper[4681]: I1007 18:31:26.369448 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad104b60-6fde-4200-a943-0d91730d264c-utilities\") pod \"community-operators-l9grq\" (UID: \"ad104b60-6fde-4200-a943-0d91730d264c\") " pod="openshift-marketplace/community-operators-l9grq"
Oct 07 18:31:26 crc kubenswrapper[4681]: I1007 18:31:26.369829 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rznxr\" (UniqueName: \"kubernetes.io/projected/ad104b60-6fde-4200-a943-0d91730d264c-kube-api-access-rznxr\") pod \"community-operators-l9grq\" (UID: \"ad104b60-6fde-4200-a943-0d91730d264c\") " pod="openshift-marketplace/community-operators-l9grq"
Oct 07 18:31:26 crc kubenswrapper[4681]: I1007 18:31:26.369975 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad104b60-6fde-4200-a943-0d91730d264c-catalog-content\") pod \"community-operators-l9grq\" (UID: \"ad104b60-6fde-4200-a943-0d91730d264c\") " pod="openshift-marketplace/community-operators-l9grq"
Oct 07 18:31:26 crc kubenswrapper[4681]: I1007 18:31:26.370078 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad104b60-6fde-4200-a943-0d91730d264c-utilities\") pod \"community-operators-l9grq\" (UID: \"ad104b60-6fde-4200-a943-0d91730d264c\") " pod="openshift-marketplace/community-operators-l9grq"
Oct 07 18:31:26 crc kubenswrapper[4681]: I1007 18:31:26.370418 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad104b60-6fde-4200-a943-0d91730d264c-catalog-content\") pod \"community-operators-l9grq\" (UID: \"ad104b60-6fde-4200-a943-0d91730d264c\") " pod="openshift-marketplace/community-operators-l9grq"
Oct 07 18:31:26 crc kubenswrapper[4681]: I1007 18:31:26.394077 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rznxr\" (UniqueName: \"kubernetes.io/projected/ad104b60-6fde-4200-a943-0d91730d264c-kube-api-access-rznxr\") pod \"community-operators-l9grq\" (UID: \"ad104b60-6fde-4200-a943-0d91730d264c\") " pod="openshift-marketplace/community-operators-l9grq"
Oct 07 18:31:26 crc kubenswrapper[4681]: I1007 18:31:26.514100 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l9grq"
Oct 07 18:31:27 crc kubenswrapper[4681]: I1007 18:31:27.155050 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l9grq"]
Oct 07 18:31:27 crc kubenswrapper[4681]: I1007 18:31:27.322086 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9grq" event={"ID":"ad104b60-6fde-4200-a943-0d91730d264c","Type":"ContainerStarted","Data":"b71050bd8256b923af25561a66e123562926c006a6bdd04bb32e42e6b6062725"}
Oct 07 18:31:28 crc kubenswrapper[4681]: I1007 18:31:28.334592 4681 generic.go:334] "Generic (PLEG): container finished" podID="ad104b60-6fde-4200-a943-0d91730d264c" containerID="a0e5629cd237d8f7a7b9ea56bdecbb48c9937fd8c41a6c6d94338dcb859fa698" exitCode=0
Oct 07 18:31:28 crc kubenswrapper[4681]: I1007 18:31:28.334643 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9grq" event={"ID":"ad104b60-6fde-4200-a943-0d91730d264c","Type":"ContainerDied","Data":"a0e5629cd237d8f7a7b9ea56bdecbb48c9937fd8c41a6c6d94338dcb859fa698"}
Oct 07 18:31:28 crc kubenswrapper[4681]: I1007 18:31:28.337489 4681 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 07 18:31:30 crc kubenswrapper[4681]: I1007 18:31:30.351919 4681 generic.go:334] "Generic (PLEG): container finished" podID="ad104b60-6fde-4200-a943-0d91730d264c" containerID="6071570b2f44db55af18941d17eb40656c9b7c9caf6d63a0690ada84ba39dbfc" exitCode=0
Oct 07 18:31:30 crc kubenswrapper[4681]: I1007 18:31:30.353495 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9grq" event={"ID":"ad104b60-6fde-4200-a943-0d91730d264c","Type":"ContainerDied","Data":"6071570b2f44db55af18941d17eb40656c9b7c9caf6d63a0690ada84ba39dbfc"}
Oct 07 18:31:31 crc kubenswrapper[4681]: I1007 18:31:31.362236 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9grq" event={"ID":"ad104b60-6fde-4200-a943-0d91730d264c","Type":"ContainerStarted","Data":"a1f8332b2621cd890bd023fefbf407daf5fa731e4756442c5bdaeb2398d3c0b6"}
Oct 07 18:31:31 crc kubenswrapper[4681]: I1007 18:31:31.381492 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l9grq" podStartSLOduration=2.940095697 podStartE2EDuration="5.381469689s" podCreationTimestamp="2025-10-07 18:31:26 +0000 UTC" firstStartedPulling="2025-10-07 18:31:28.33719296 +0000 UTC m=+5291.984604525" lastFinishedPulling="2025-10-07 18:31:30.778566962 +0000 UTC m=+5294.425978517" observedRunningTime="2025-10-07 18:31:31.377696823 +0000 UTC m=+5295.025108388" watchObservedRunningTime="2025-10-07 18:31:31.381469689 +0000 UTC m=+5295.028881244"
Oct 07 18:31:36 crc kubenswrapper[4681]: I1007 18:31:36.515118 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l9grq"
Oct 07 18:31:36 crc kubenswrapper[4681]: I1007 18:31:36.516740 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l9grq"
Oct 07 18:31:36 crc kubenswrapper[4681]: I1007 18:31:36.564798 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l9grq"
Oct 07 18:31:37 crc kubenswrapper[4681]: I1007 18:31:37.455988 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l9grq"
Oct 07 18:31:37 crc kubenswrapper[4681]: I1007 18:31:37.546303 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l9grq"]
Oct 07 18:31:39 crc kubenswrapper[4681]: I1007 18:31:39.426905 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l9grq" podUID="ad104b60-6fde-4200-a943-0d91730d264c" containerName="registry-server" containerID="cri-o://a1f8332b2621cd890bd023fefbf407daf5fa731e4756442c5bdaeb2398d3c0b6" gracePeriod=2
Oct 07 18:31:39 crc kubenswrapper[4681]: I1007 18:31:39.930360 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l9grq"
Oct 07 18:31:40 crc kubenswrapper[4681]: I1007 18:31:40.035632 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rznxr\" (UniqueName: \"kubernetes.io/projected/ad104b60-6fde-4200-a943-0d91730d264c-kube-api-access-rznxr\") pod \"ad104b60-6fde-4200-a943-0d91730d264c\" (UID: \"ad104b60-6fde-4200-a943-0d91730d264c\") "
Oct 07 18:31:40 crc kubenswrapper[4681]: I1007 18:31:40.035780 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad104b60-6fde-4200-a943-0d91730d264c-utilities\") pod \"ad104b60-6fde-4200-a943-0d91730d264c\" (UID: \"ad104b60-6fde-4200-a943-0d91730d264c\") "
Oct 07 18:31:40 crc kubenswrapper[4681]: I1007 18:31:40.035840 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad104b60-6fde-4200-a943-0d91730d264c-catalog-content\") pod \"ad104b60-6fde-4200-a943-0d91730d264c\" (UID: \"ad104b60-6fde-4200-a943-0d91730d264c\") "
Oct 07 18:31:40 crc kubenswrapper[4681]: I1007 18:31:40.037126 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad104b60-6fde-4200-a943-0d91730d264c-utilities" (OuterVolumeSpecName: "utilities") pod "ad104b60-6fde-4200-a943-0d91730d264c" (UID: "ad104b60-6fde-4200-a943-0d91730d264c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 18:31:40 crc kubenswrapper[4681]: I1007 18:31:40.041450 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad104b60-6fde-4200-a943-0d91730d264c-kube-api-access-rznxr" (OuterVolumeSpecName: "kube-api-access-rznxr") pod "ad104b60-6fde-4200-a943-0d91730d264c" (UID: "ad104b60-6fde-4200-a943-0d91730d264c"). InnerVolumeSpecName "kube-api-access-rznxr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 18:31:40 crc kubenswrapper[4681]: I1007 18:31:40.092847 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad104b60-6fde-4200-a943-0d91730d264c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad104b60-6fde-4200-a943-0d91730d264c" (UID: "ad104b60-6fde-4200-a943-0d91730d264c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 18:31:40 crc kubenswrapper[4681]: I1007 18:31:40.138036 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rznxr\" (UniqueName: \"kubernetes.io/projected/ad104b60-6fde-4200-a943-0d91730d264c-kube-api-access-rznxr\") on node \"crc\" DevicePath \"\""
Oct 07 18:31:40 crc kubenswrapper[4681]: I1007 18:31:40.138069 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad104b60-6fde-4200-a943-0d91730d264c-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 18:31:40 crc kubenswrapper[4681]: I1007 18:31:40.138079 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad104b60-6fde-4200-a943-0d91730d264c-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 18:31:40 crc kubenswrapper[4681]: I1007 18:31:40.436961 4681 generic.go:334] "Generic (PLEG): container finished" podID="ad104b60-6fde-4200-a943-0d91730d264c" containerID="a1f8332b2621cd890bd023fefbf407daf5fa731e4756442c5bdaeb2398d3c0b6" exitCode=0
Oct 07 18:31:40 crc kubenswrapper[4681]: I1007 18:31:40.436999 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9grq" event={"ID":"ad104b60-6fde-4200-a943-0d91730d264c","Type":"ContainerDied","Data":"a1f8332b2621cd890bd023fefbf407daf5fa731e4756442c5bdaeb2398d3c0b6"}
Oct 07 18:31:40 crc kubenswrapper[4681]: I1007 18:31:40.437023 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9grq" event={"ID":"ad104b60-6fde-4200-a943-0d91730d264c","Type":"ContainerDied","Data":"b71050bd8256b923af25561a66e123562926c006a6bdd04bb32e42e6b6062725"}
Oct 07 18:31:40 crc kubenswrapper[4681]: I1007 18:31:40.437038 4681 scope.go:117] "RemoveContainer" containerID="a1f8332b2621cd890bd023fefbf407daf5fa731e4756442c5bdaeb2398d3c0b6"
Oct 07 18:31:40 crc kubenswrapper[4681]: I1007 18:31:40.437123 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l9grq"
Oct 07 18:31:40 crc kubenswrapper[4681]: I1007 18:31:40.471122 4681 scope.go:117] "RemoveContainer" containerID="6071570b2f44db55af18941d17eb40656c9b7c9caf6d63a0690ada84ba39dbfc"
Oct 07 18:31:40 crc kubenswrapper[4681]: I1007 18:31:40.483307 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l9grq"]
Oct 07 18:31:40 crc kubenswrapper[4681]: I1007 18:31:40.493997 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l9grq"]
Oct 07 18:31:40 crc kubenswrapper[4681]: I1007 18:31:40.498438 4681 scope.go:117] "RemoveContainer" containerID="a0e5629cd237d8f7a7b9ea56bdecbb48c9937fd8c41a6c6d94338dcb859fa698"
Oct 07 18:31:40 crc kubenswrapper[4681]: I1007 18:31:40.554171 4681 scope.go:117] "RemoveContainer" containerID="a1f8332b2621cd890bd023fefbf407daf5fa731e4756442c5bdaeb2398d3c0b6"
Oct 07 18:31:40 crc kubenswrapper[4681]: E1007 18:31:40.554742 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1f8332b2621cd890bd023fefbf407daf5fa731e4756442c5bdaeb2398d3c0b6\": container with ID starting with a1f8332b2621cd890bd023fefbf407daf5fa731e4756442c5bdaeb2398d3c0b6 not found: ID does not exist" containerID="a1f8332b2621cd890bd023fefbf407daf5fa731e4756442c5bdaeb2398d3c0b6"
Oct 07 18:31:40 crc kubenswrapper[4681]: I1007 18:31:40.554782 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1f8332b2621cd890bd023fefbf407daf5fa731e4756442c5bdaeb2398d3c0b6"} err="failed to get container status \"a1f8332b2621cd890bd023fefbf407daf5fa731e4756442c5bdaeb2398d3c0b6\": rpc error: code = NotFound desc = could not find container \"a1f8332b2621cd890bd023fefbf407daf5fa731e4756442c5bdaeb2398d3c0b6\": container with ID starting with a1f8332b2621cd890bd023fefbf407daf5fa731e4756442c5bdaeb2398d3c0b6 not found: ID does not exist"
Oct 07 18:31:40 crc kubenswrapper[4681]: I1007 18:31:40.554817 4681 scope.go:117] "RemoveContainer" containerID="6071570b2f44db55af18941d17eb40656c9b7c9caf6d63a0690ada84ba39dbfc"
Oct 07 18:31:40 crc kubenswrapper[4681]: E1007 18:31:40.555329 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6071570b2f44db55af18941d17eb40656c9b7c9caf6d63a0690ada84ba39dbfc\": container with ID starting with 6071570b2f44db55af18941d17eb40656c9b7c9caf6d63a0690ada84ba39dbfc not found: ID does not exist" containerID="6071570b2f44db55af18941d17eb40656c9b7c9caf6d63a0690ada84ba39dbfc"
Oct 07 18:31:40 crc kubenswrapper[4681]: I1007 18:31:40.555406 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6071570b2f44db55af18941d17eb40656c9b7c9caf6d63a0690ada84ba39dbfc"} err="failed to get container status \"6071570b2f44db55af18941d17eb40656c9b7c9caf6d63a0690ada84ba39dbfc\": rpc error: code = NotFound desc = could not find container \"6071570b2f44db55af18941d17eb40656c9b7c9caf6d63a0690ada84ba39dbfc\": container with ID starting with 6071570b2f44db55af18941d17eb40656c9b7c9caf6d63a0690ada84ba39dbfc not found: ID does not exist"
Oct 07 18:31:40 crc kubenswrapper[4681]: I1007 18:31:40.555448 4681 scope.go:117] "RemoveContainer" containerID="a0e5629cd237d8f7a7b9ea56bdecbb48c9937fd8c41a6c6d94338dcb859fa698"
Oct 07 18:31:40 crc kubenswrapper[4681]: E1007 18:31:40.555938 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0e5629cd237d8f7a7b9ea56bdecbb48c9937fd8c41a6c6d94338dcb859fa698\": container with ID starting with a0e5629cd237d8f7a7b9ea56bdecbb48c9937fd8c41a6c6d94338dcb859fa698 not found: ID does not exist" containerID="a0e5629cd237d8f7a7b9ea56bdecbb48c9937fd8c41a6c6d94338dcb859fa698"
Oct 07 18:31:40 crc kubenswrapper[4681]: I1007 18:31:40.555966 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0e5629cd237d8f7a7b9ea56bdecbb48c9937fd8c41a6c6d94338dcb859fa698"} err="failed to get container status \"a0e5629cd237d8f7a7b9ea56bdecbb48c9937fd8c41a6c6d94338dcb859fa698\": rpc error: code = NotFound desc = could not find container \"a0e5629cd237d8f7a7b9ea56bdecbb48c9937fd8c41a6c6d94338dcb859fa698\": container with ID starting with a0e5629cd237d8f7a7b9ea56bdecbb48c9937fd8c41a6c6d94338dcb859fa698 not found: ID does not exist"
Oct 07 18:31:41 crc kubenswrapper[4681]: I1007 18:31:41.039935 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad104b60-6fde-4200-a943-0d91730d264c" path="/var/lib/kubelet/pods/ad104b60-6fde-4200-a943-0d91730d264c/volumes"
Oct 07 18:31:56 crc kubenswrapper[4681]: I1007 18:31:56.396372 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sdszr"]
Oct 07 18:31:56 crc kubenswrapper[4681]: E1007 18:31:56.397228 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad104b60-6fde-4200-a943-0d91730d264c" containerName="extract-utilities"
Oct 07 18:31:56 crc kubenswrapper[4681]: I1007 18:31:56.397239 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad104b60-6fde-4200-a943-0d91730d264c" containerName="extract-utilities"
Oct 07 18:31:56 crc kubenswrapper[4681]: E1007 18:31:56.397253 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad104b60-6fde-4200-a943-0d91730d264c" containerName="extract-content"
Oct 07 18:31:56 crc kubenswrapper[4681]: I1007 18:31:56.397259 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad104b60-6fde-4200-a943-0d91730d264c" containerName="extract-content"
Oct 07 18:31:56 crc kubenswrapper[4681]: E1007 18:31:56.397282 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad104b60-6fde-4200-a943-0d91730d264c" containerName="registry-server"
Oct 07 18:31:56 crc kubenswrapper[4681]: I1007 18:31:56.397287 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad104b60-6fde-4200-a943-0d91730d264c" containerName="registry-server"
Oct 07 18:31:56 crc kubenswrapper[4681]: I1007 18:31:56.397443 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad104b60-6fde-4200-a943-0d91730d264c" containerName="registry-server"
Oct 07 18:31:56 crc kubenswrapper[4681]: I1007 18:31:56.398715 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sdszr"
Oct 07 18:31:56 crc kubenswrapper[4681]: I1007 18:31:56.454588 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sdszr"]
Oct 07 18:31:56 crc kubenswrapper[4681]: I1007 18:31:56.557555 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr44j\" (UniqueName: \"kubernetes.io/projected/a37a7687-4ff6-4607-bb04-defa3bd91be5-kube-api-access-tr44j\") pod \"redhat-operators-sdszr\" (UID: \"a37a7687-4ff6-4607-bb04-defa3bd91be5\") " pod="openshift-marketplace/redhat-operators-sdszr"
Oct 07 18:31:56 crc kubenswrapper[4681]: I1007 18:31:56.557643 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a37a7687-4ff6-4607-bb04-defa3bd91be5-catalog-content\") pod \"redhat-operators-sdszr\" (UID: \"a37a7687-4ff6-4607-bb04-defa3bd91be5\") " pod="openshift-marketplace/redhat-operators-sdszr"
Oct 07 18:31:56 crc kubenswrapper[4681]: I1007 18:31:56.557763 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a37a7687-4ff6-4607-bb04-defa3bd91be5-utilities\") pod \"redhat-operators-sdszr\" (UID: \"a37a7687-4ff6-4607-bb04-defa3bd91be5\") " pod="openshift-marketplace/redhat-operators-sdszr"
Oct 07 18:31:56 crc kubenswrapper[4681]: I1007 18:31:56.659515 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a37a7687-4ff6-4607-bb04-defa3bd91be5-catalog-content\") pod \"redhat-operators-sdszr\" (UID: \"a37a7687-4ff6-4607-bb04-defa3bd91be5\") " pod="openshift-marketplace/redhat-operators-sdszr"
Oct 07 18:31:56 crc kubenswrapper[4681]: I1007 18:31:56.659713 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a37a7687-4ff6-4607-bb04-defa3bd91be5-utilities\") pod \"redhat-operators-sdszr\" (UID: \"a37a7687-4ff6-4607-bb04-defa3bd91be5\") " pod="openshift-marketplace/redhat-operators-sdszr"
Oct 07 18:31:56 crc kubenswrapper[4681]: I1007 18:31:56.659745 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr44j\" (UniqueName: \"kubernetes.io/projected/a37a7687-4ff6-4607-bb04-defa3bd91be5-kube-api-access-tr44j\") pod \"redhat-operators-sdszr\" (UID: \"a37a7687-4ff6-4607-bb04-defa3bd91be5\") " pod="openshift-marketplace/redhat-operators-sdszr"
Oct 07 18:31:56 crc kubenswrapper[4681]: I1007 18:31:56.660173 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a37a7687-4ff6-4607-bb04-defa3bd91be5-catalog-content\") pod \"redhat-operators-sdszr\" (UID: \"a37a7687-4ff6-4607-bb04-defa3bd91be5\") " pod="openshift-marketplace/redhat-operators-sdszr"
Oct 07 18:31:56 crc kubenswrapper[4681]: I1007 18:31:56.660232 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a37a7687-4ff6-4607-bb04-defa3bd91be5-utilities\") pod \"redhat-operators-sdszr\" (UID: \"a37a7687-4ff6-4607-bb04-defa3bd91be5\") " pod="openshift-marketplace/redhat-operators-sdszr"
Oct 07 18:31:56 crc kubenswrapper[4681]: I1007 18:31:56.679512 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr44j\" (UniqueName: \"kubernetes.io/projected/a37a7687-4ff6-4607-bb04-defa3bd91be5-kube-api-access-tr44j\") pod \"redhat-operators-sdszr\" (UID: \"a37a7687-4ff6-4607-bb04-defa3bd91be5\") " pod="openshift-marketplace/redhat-operators-sdszr"
Oct 07 18:31:56 crc kubenswrapper[4681]: I1007 18:31:56.719902 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sdszr"
Oct 07 18:31:57 crc kubenswrapper[4681]: I1007 18:31:57.065484 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sdszr"]
Oct 07 18:31:57 crc kubenswrapper[4681]: I1007 18:31:57.591336 4681 generic.go:334] "Generic (PLEG): container finished" podID="a37a7687-4ff6-4607-bb04-defa3bd91be5" containerID="85804bf072e69a08337bce57bb6bb80e93a3b87e0913873b39b464e159995153" exitCode=0
Oct 07 18:31:57 crc kubenswrapper[4681]: I1007 18:31:57.591417 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sdszr" event={"ID":"a37a7687-4ff6-4607-bb04-defa3bd91be5","Type":"ContainerDied","Data":"85804bf072e69a08337bce57bb6bb80e93a3b87e0913873b39b464e159995153"}
Oct 07 18:31:57 crc kubenswrapper[4681]: I1007 18:31:57.591647 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sdszr" event={"ID":"a37a7687-4ff6-4607-bb04-defa3bd91be5","Type":"ContainerStarted","Data":"a73d87e2ac0a0fb48a4f7949caddc571a8ad0eb1d5bc4f387f4978f301a80fda"}
Oct 07 18:31:58 crc kubenswrapper[4681]: I1007 18:31:58.603058 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sdszr" event={"ID":"a37a7687-4ff6-4607-bb04-defa3bd91be5","Type":"ContainerStarted","Data":"0f504c44f88312c35b502a61ae8ec82cb6494d64d76f3af036b2aa2ab90c295b"}
Oct 07 18:32:02 crc kubenswrapper[4681]: I1007 18:32:02.640344 4681 generic.go:334] "Generic (PLEG): container finished" podID="a37a7687-4ff6-4607-bb04-defa3bd91be5" containerID="0f504c44f88312c35b502a61ae8ec82cb6494d64d76f3af036b2aa2ab90c295b" exitCode=0
Oct 07 18:32:02 crc kubenswrapper[4681]: I1007 18:32:02.640401 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sdszr" event={"ID":"a37a7687-4ff6-4607-bb04-defa3bd91be5","Type":"ContainerDied","Data":"0f504c44f88312c35b502a61ae8ec82cb6494d64d76f3af036b2aa2ab90c295b"}
Oct 07 18:32:03 crc kubenswrapper[4681]: I1007 18:32:03.650703 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sdszr" event={"ID":"a37a7687-4ff6-4607-bb04-defa3bd91be5","Type":"ContainerStarted","Data":"63dc2ce621c12265d8d74d5272c267e9a0e6405a475ecab60bd75231f8ab4f22"}
Oct 07 18:32:06 crc kubenswrapper[4681]: I1007 18:32:06.720732 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sdszr"
Oct 07 18:32:06 crc kubenswrapper[4681]: I1007 18:32:06.721093 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sdszr"
Oct 07 18:32:07 crc kubenswrapper[4681]: I1007 18:32:07.772531 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sdszr" podUID="a37a7687-4ff6-4607-bb04-defa3bd91be5" containerName="registry-server" probeResult="failure" output=<
Oct 07 18:32:07 crc kubenswrapper[4681]: timeout: failed to connect service ":50051" within 1s
Oct 07 18:32:07 crc kubenswrapper[4681]: >
Oct 07 18:32:16 crc kubenswrapper[4681]: I1007 18:32:16.778457 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sdszr"
Oct 07 18:32:16 crc kubenswrapper[4681]: I1007 18:32:16.808964 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sdszr" podStartSLOduration=15.362599996 podStartE2EDuration="20.808938305s" podCreationTimestamp="2025-10-07 18:31:56 +0000 UTC" firstStartedPulling="2025-10-07 18:31:57.593179407 +0000 UTC m=+5321.240590952" lastFinishedPulling="2025-10-07 18:32:03.039517706 +0000 UTC m=+5326.686929261" observedRunningTime="2025-10-07 18:32:03.677768496 +0000 UTC m=+5327.325180051" watchObservedRunningTime="2025-10-07 18:32:16.808938305 +0000 UTC m=+5340.456349860"
Oct 07 18:32:16 crc kubenswrapper[4681]: I1007 18:32:16.838473 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sdszr"
Oct 07 18:32:17 crc kubenswrapper[4681]: I1007 18:32:17.025522 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sdszr"]
Oct 07 18:32:18 crc kubenswrapper[4681]: I1007 18:32:18.789580 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sdszr" podUID="a37a7687-4ff6-4607-bb04-defa3bd91be5" containerName="registry-server" containerID="cri-o://63dc2ce621c12265d8d74d5272c267e9a0e6405a475ecab60bd75231f8ab4f22" gracePeriod=2
Oct 07 18:32:19 crc kubenswrapper[4681]: I1007 18:32:19.232591 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sdszr"
Oct 07 18:32:19 crc kubenswrapper[4681]: I1007 18:32:19.411491 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr44j\" (UniqueName: \"kubernetes.io/projected/a37a7687-4ff6-4607-bb04-defa3bd91be5-kube-api-access-tr44j\") pod \"a37a7687-4ff6-4607-bb04-defa3bd91be5\" (UID: \"a37a7687-4ff6-4607-bb04-defa3bd91be5\") "
Oct 07 18:32:19 crc kubenswrapper[4681]: I1007 18:32:19.412209 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a37a7687-4ff6-4607-bb04-defa3bd91be5-utilities\") pod \"a37a7687-4ff6-4607-bb04-defa3bd91be5\" (UID: \"a37a7687-4ff6-4607-bb04-defa3bd91be5\") "
Oct 07 18:32:19 crc kubenswrapper[4681]: I1007 18:32:19.412471 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a37a7687-4ff6-4607-bb04-defa3bd91be5-catalog-content\") pod \"a37a7687-4ff6-4607-bb04-defa3bd91be5\" (UID: \"a37a7687-4ff6-4607-bb04-defa3bd91be5\") "
Oct 07 18:32:19 crc kubenswrapper[4681]: I1007 18:32:19.417125 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a37a7687-4ff6-4607-bb04-defa3bd91be5-utilities" (OuterVolumeSpecName: "utilities") pod "a37a7687-4ff6-4607-bb04-defa3bd91be5" (UID: "a37a7687-4ff6-4607-bb04-defa3bd91be5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 18:32:19 crc kubenswrapper[4681]: I1007 18:32:19.427544 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a37a7687-4ff6-4607-bb04-defa3bd91be5-kube-api-access-tr44j" (OuterVolumeSpecName: "kube-api-access-tr44j") pod "a37a7687-4ff6-4607-bb04-defa3bd91be5" (UID: "a37a7687-4ff6-4607-bb04-defa3bd91be5"). InnerVolumeSpecName "kube-api-access-tr44j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 18:32:19 crc kubenswrapper[4681]: I1007 18:32:19.501438 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a37a7687-4ff6-4607-bb04-defa3bd91be5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a37a7687-4ff6-4607-bb04-defa3bd91be5" (UID: "a37a7687-4ff6-4607-bb04-defa3bd91be5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 18:32:19 crc kubenswrapper[4681]: I1007 18:32:19.514958 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr44j\" (UniqueName: \"kubernetes.io/projected/a37a7687-4ff6-4607-bb04-defa3bd91be5-kube-api-access-tr44j\") on node \"crc\" DevicePath \"\""
Oct 07 18:32:19 crc kubenswrapper[4681]: I1007 18:32:19.515277 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a37a7687-4ff6-4607-bb04-defa3bd91be5-utilities\") on node \"crc\" DevicePath \"\""
Oct 07 18:32:19 crc kubenswrapper[4681]: I1007 18:32:19.515429 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a37a7687-4ff6-4607-bb04-defa3bd91be5-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 07 18:32:19 crc kubenswrapper[4681]: I1007 18:32:19.800503 4681 generic.go:334] "Generic (PLEG): container finished" podID="a37a7687-4ff6-4607-bb04-defa3bd91be5" containerID="63dc2ce621c12265d8d74d5272c267e9a0e6405a475ecab60bd75231f8ab4f22" exitCode=0
Oct 07 18:32:19 crc kubenswrapper[4681]: I1007 18:32:19.800544 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sdszr" event={"ID":"a37a7687-4ff6-4607-bb04-defa3bd91be5","Type":"ContainerDied","Data":"63dc2ce621c12265d8d74d5272c267e9a0e6405a475ecab60bd75231f8ab4f22"}
Oct 07 18:32:19 crc kubenswrapper[4681]: I1007 18:32:19.800572 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sdszr" event={"ID":"a37a7687-4ff6-4607-bb04-defa3bd91be5","Type":"ContainerDied","Data":"a73d87e2ac0a0fb48a4f7949caddc571a8ad0eb1d5bc4f387f4978f301a80fda"}
Oct 07 18:32:19 crc kubenswrapper[4681]: I1007 18:32:19.800587 4681 scope.go:117] "RemoveContainer" containerID="63dc2ce621c12265d8d74d5272c267e9a0e6405a475ecab60bd75231f8ab4f22"
Oct 07 18:32:19 crc kubenswrapper[4681]: I1007 18:32:19.800701 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sdszr"
Oct 07 18:32:19 crc kubenswrapper[4681]: I1007 18:32:19.839978 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sdszr"]
Oct 07 18:32:19 crc kubenswrapper[4681]: I1007 18:32:19.851034 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sdszr"]
Oct 07 18:32:19 crc kubenswrapper[4681]: I1007 18:32:19.857970 4681 scope.go:117] "RemoveContainer" containerID="0f504c44f88312c35b502a61ae8ec82cb6494d64d76f3af036b2aa2ab90c295b"
Oct 07 18:32:19 crc kubenswrapper[4681]: I1007 18:32:19.878478 4681 scope.go:117] "RemoveContainer" containerID="85804bf072e69a08337bce57bb6bb80e93a3b87e0913873b39b464e159995153"
Oct 07 18:32:19 crc kubenswrapper[4681]: I1007 18:32:19.915061 4681 scope.go:117] "RemoveContainer" containerID="63dc2ce621c12265d8d74d5272c267e9a0e6405a475ecab60bd75231f8ab4f22"
Oct 07 18:32:19 crc kubenswrapper[4681]: E1007 18:32:19.915573 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63dc2ce621c12265d8d74d5272c267e9a0e6405a475ecab60bd75231f8ab4f22\": container with ID starting with 63dc2ce621c12265d8d74d5272c267e9a0e6405a475ecab60bd75231f8ab4f22 not found: ID does not exist" containerID="63dc2ce621c12265d8d74d5272c267e9a0e6405a475ecab60bd75231f8ab4f22"
Oct 07 18:32:19 crc kubenswrapper[4681]: I1007 18:32:19.915611 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63dc2ce621c12265d8d74d5272c267e9a0e6405a475ecab60bd75231f8ab4f22"} err="failed to get container status \"63dc2ce621c12265d8d74d5272c267e9a0e6405a475ecab60bd75231f8ab4f22\": rpc error: code = NotFound desc = could not find container \"63dc2ce621c12265d8d74d5272c267e9a0e6405a475ecab60bd75231f8ab4f22\": container with ID starting with 63dc2ce621c12265d8d74d5272c267e9a0e6405a475ecab60bd75231f8ab4f22 not found: ID does not exist"
Oct 07 18:32:19 crc kubenswrapper[4681]: I1007 18:32:19.915629 4681 scope.go:117] "RemoveContainer" containerID="0f504c44f88312c35b502a61ae8ec82cb6494d64d76f3af036b2aa2ab90c295b"
Oct 07 18:32:19 crc kubenswrapper[4681]: E1007 18:32:19.916215 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f504c44f88312c35b502a61ae8ec82cb6494d64d76f3af036b2aa2ab90c295b\": container with ID starting with 0f504c44f88312c35b502a61ae8ec82cb6494d64d76f3af036b2aa2ab90c295b not found: ID does not exist" containerID="0f504c44f88312c35b502a61ae8ec82cb6494d64d76f3af036b2aa2ab90c295b"
Oct 07 18:32:19 crc kubenswrapper[4681]: I1007 18:32:19.916239 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f504c44f88312c35b502a61ae8ec82cb6494d64d76f3af036b2aa2ab90c295b"} err="failed to get container status \"0f504c44f88312c35b502a61ae8ec82cb6494d64d76f3af036b2aa2ab90c295b\": rpc error: code = NotFound desc = could not find container \"0f504c44f88312c35b502a61ae8ec82cb6494d64d76f3af036b2aa2ab90c295b\": container with ID starting with 0f504c44f88312c35b502a61ae8ec82cb6494d64d76f3af036b2aa2ab90c295b not found: ID does not exist"
Oct 07 18:32:19 crc kubenswrapper[4681]: I1007 18:32:19.916253 4681 scope.go:117] "RemoveContainer" containerID="85804bf072e69a08337bce57bb6bb80e93a3b87e0913873b39b464e159995153"
Oct 07 18:32:19 crc kubenswrapper[4681]: E1007 18:32:19.916559 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85804bf072e69a08337bce57bb6bb80e93a3b87e0913873b39b464e159995153\": container with ID starting with 85804bf072e69a08337bce57bb6bb80e93a3b87e0913873b39b464e159995153 not found: ID does not exist" containerID="85804bf072e69a08337bce57bb6bb80e93a3b87e0913873b39b464e159995153"
Oct 07 18:32:19 crc kubenswrapper[4681]: I1007 18:32:19.916605 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85804bf072e69a08337bce57bb6bb80e93a3b87e0913873b39b464e159995153"} err="failed to get container status \"85804bf072e69a08337bce57bb6bb80e93a3b87e0913873b39b464e159995153\": rpc error: code = NotFound desc = could not find container \"85804bf072e69a08337bce57bb6bb80e93a3b87e0913873b39b464e159995153\": container with ID starting with 85804bf072e69a08337bce57bb6bb80e93a3b87e0913873b39b464e159995153 not found: ID does not exist"
Oct 07 18:32:21 crc kubenswrapper[4681]: I1007 18:32:21.040035 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a37a7687-4ff6-4607-bb04-defa3bd91be5" path="/var/lib/kubelet/pods/a37a7687-4ff6-4607-bb04-defa3bd91be5/volumes"
Oct 07 18:32:37 crc kubenswrapper[4681]: I1007 18:32:37.797968 4681 scope.go:117] "RemoveContainer" containerID="257f67f8a991d37d096cd20dbcd01a708837d13ee11d1fae9c7016f45616e69b"
Oct 07 18:33:06 crc kubenswrapper[4681]: I1007 18:33:06.198311 4681 generic.go:334] "Generic (PLEG): container finished" podID="67296e11-c9cb-4eb7-bcf7-26822676668b" containerID="a9b6007036794d95dff3894fdc629aaa2a7846d0d89a3c7bfe2ab6bf0ce5a983" exitCode=0
Oct 07 18:33:06 crc kubenswrapper[4681]: I1007 18:33:06.198419 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jwxmq/must-gather-hgbpv" event={"ID":"67296e11-c9cb-4eb7-bcf7-26822676668b","Type":"ContainerDied","Data":"a9b6007036794d95dff3894fdc629aaa2a7846d0d89a3c7bfe2ab6bf0ce5a983"}
Oct 07 18:33:06 crc kubenswrapper[4681]: I1007 18:33:06.199533 4681 scope.go:117] "RemoveContainer" containerID="a9b6007036794d95dff3894fdc629aaa2a7846d0d89a3c7bfe2ab6bf0ce5a983"
Oct 07 18:33:06 crc kubenswrapper[4681]: I1007 18:33:06.319691 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jwxmq_must-gather-hgbpv_67296e11-c9cb-4eb7-bcf7-26822676668b/gather/0.log"
Oct 07 18:33:21 crc kubenswrapper[4681]: I1007 18:33:21.072163 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jwxmq/must-gather-hgbpv"]
Oct 07 18:33:21 crc kubenswrapper[4681]: I1007 18:33:21.072927 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-jwxmq/must-gather-hgbpv" podUID="67296e11-c9cb-4eb7-bcf7-26822676668b" containerName="copy" containerID="cri-o://30fd71733d1c132f18cfb3e4dea15a34fc30918c31ee8392fc87040ee459cb53" gracePeriod=2
Oct 07 18:33:21 crc kubenswrapper[4681]: I1007 18:33:21.080759 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jwxmq/must-gather-hgbpv"]
Oct 07 18:33:21 crc kubenswrapper[4681]: I1007 18:33:21.340189 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jwxmq_must-gather-hgbpv_67296e11-c9cb-4eb7-bcf7-26822676668b/copy/0.log"
Oct 07 18:33:21 crc kubenswrapper[4681]: I1007 18:33:21.341649 4681 generic.go:334] "Generic (PLEG): container finished" podID="67296e11-c9cb-4eb7-bcf7-26822676668b" containerID="30fd71733d1c132f18cfb3e4dea15a34fc30918c31ee8392fc87040ee459cb53" exitCode=143
Oct 07 18:33:21 crc kubenswrapper[4681]: I1007 18:33:21.471511 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jwxmq_must-gather-hgbpv_67296e11-c9cb-4eb7-bcf7-26822676668b/copy/0.log"
Oct 07 18:33:21 crc kubenswrapper[4681]: I1007 18:33:21.472324 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jwxmq/must-gather-hgbpv"
Oct 07 18:33:21 crc kubenswrapper[4681]: I1007 18:33:21.602277 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/67296e11-c9cb-4eb7-bcf7-26822676668b-must-gather-output\") pod \"67296e11-c9cb-4eb7-bcf7-26822676668b\" (UID: \"67296e11-c9cb-4eb7-bcf7-26822676668b\") "
Oct 07 18:33:21 crc kubenswrapper[4681]: I1007 18:33:21.602428 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcdjg\" (UniqueName: \"kubernetes.io/projected/67296e11-c9cb-4eb7-bcf7-26822676668b-kube-api-access-qcdjg\") pod \"67296e11-c9cb-4eb7-bcf7-26822676668b\" (UID: \"67296e11-c9cb-4eb7-bcf7-26822676668b\") "
Oct 07 18:33:21 crc kubenswrapper[4681]: I1007 18:33:21.610157 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67296e11-c9cb-4eb7-bcf7-26822676668b-kube-api-access-qcdjg" (OuterVolumeSpecName: "kube-api-access-qcdjg") pod "67296e11-c9cb-4eb7-bcf7-26822676668b" (UID: "67296e11-c9cb-4eb7-bcf7-26822676668b"). InnerVolumeSpecName "kube-api-access-qcdjg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 07 18:33:21 crc kubenswrapper[4681]: I1007 18:33:21.704778 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcdjg\" (UniqueName: \"kubernetes.io/projected/67296e11-c9cb-4eb7-bcf7-26822676668b-kube-api-access-qcdjg\") on node \"crc\" DevicePath \"\""
Oct 07 18:33:21 crc kubenswrapper[4681]: I1007 18:33:21.811559 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67296e11-c9cb-4eb7-bcf7-26822676668b-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "67296e11-c9cb-4eb7-bcf7-26822676668b" (UID: "67296e11-c9cb-4eb7-bcf7-26822676668b"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 07 18:33:21 crc kubenswrapper[4681]: I1007 18:33:21.909107 4681 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/67296e11-c9cb-4eb7-bcf7-26822676668b-must-gather-output\") on node \"crc\" DevicePath \"\""
Oct 07 18:33:22 crc kubenswrapper[4681]: I1007 18:33:22.353410 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jwxmq_must-gather-hgbpv_67296e11-c9cb-4eb7-bcf7-26822676668b/copy/0.log"
Oct 07 18:33:22 crc kubenswrapper[4681]: I1007 18:33:22.356176 4681 scope.go:117] "RemoveContainer" containerID="30fd71733d1c132f18cfb3e4dea15a34fc30918c31ee8392fc87040ee459cb53"
Oct 07 18:33:22 crc kubenswrapper[4681]: I1007 18:33:22.356362 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jwxmq/must-gather-hgbpv"
Oct 07 18:33:22 crc kubenswrapper[4681]: I1007 18:33:22.389282 4681 scope.go:117] "RemoveContainer" containerID="a9b6007036794d95dff3894fdc629aaa2a7846d0d89a3c7bfe2ab6bf0ce5a983"
Oct 07 18:33:23 crc kubenswrapper[4681]: I1007 18:33:23.038497 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67296e11-c9cb-4eb7-bcf7-26822676668b" path="/var/lib/kubelet/pods/67296e11-c9cb-4eb7-bcf7-26822676668b/volumes"
Oct 07 18:33:42 crc kubenswrapper[4681]: I1007 18:33:42.195480 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 18:33:42 crc kubenswrapper[4681]: I1007 18:33:42.196006 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 18:34:12 crc kubenswrapper[4681]: I1007 18:34:12.195316 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 18:34:12 crc kubenswrapper[4681]: I1007 18:34:12.195804 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 18:34:42 crc kubenswrapper[4681]: I1007 18:34:42.195257 4681 patch_prober.go:28] interesting pod/machine-config-daemon-8z5w6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 07 18:34:42 crc kubenswrapper[4681]: I1007 18:34:42.196023 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 07 18:34:42 crc kubenswrapper[4681]: I1007 18:34:42.196095 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6"
Oct 07 18:34:42 crc kubenswrapper[4681]: I1007 18:34:42.197019 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1cdf0cb28dd02a2ebb3c0a6f0bb22f5afd48841d5bba591bfff927634e0bc73a"} pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 07 18:34:42 crc kubenswrapper[4681]: I1007 18:34:42.197106 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" podUID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerName="machine-config-daemon" containerID="cri-o://1cdf0cb28dd02a2ebb3c0a6f0bb22f5afd48841d5bba591bfff927634e0bc73a" gracePeriod=600
Oct 07 18:34:43 crc kubenswrapper[4681]: I1007 18:34:43.076610 4681 generic.go:334] "Generic (PLEG): container finished" podID="0888bed1-620e-4a75-bcf8-460b4cd280ea" containerID="1cdf0cb28dd02a2ebb3c0a6f0bb22f5afd48841d5bba591bfff927634e0bc73a" exitCode=0
Oct 07 18:34:43 crc kubenswrapper[4681]: I1007 18:34:43.076696 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerDied","Data":"1cdf0cb28dd02a2ebb3c0a6f0bb22f5afd48841d5bba591bfff927634e0bc73a"}
Oct 07 18:34:43 crc kubenswrapper[4681]: I1007 18:34:43.077457 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8z5w6" event={"ID":"0888bed1-620e-4a75-bcf8-460b4cd280ea","Type":"ContainerStarted","Data":"1d931a72fa9c605e45a68d794470539ac2bbc08bc0fa6a37ece8cb2e216b0808"}
Oct 07 18:34:43 crc kubenswrapper[4681]: I1007 18:34:43.077487 4681 scope.go:117] "RemoveContainer" containerID="84c45b36e2ed41c4b495e2fd580ea0e5167ec3d76823c26d7b03779acd266822"
Oct 07 18:35:37 crc kubenswrapper[4681]: I1007 18:35:37.942116 4681 scope.go:117] "RemoveContainer" containerID="a821c10810f9e44290fb344684c3fe72e7c1c10016bb0d2ab0d3470421d384e7"